0s autopkgtest [11:51:28]: starting date and time: 2024-06-16 11:51:28+0000 0s autopkgtest [11:51:28]: git checkout: 433ed4cb Merge branch 'skia/nova_flock' into 'ubuntu/5.34+prod' 0s autopkgtest [11:51:28]: host juju-7f2275-prod-proposed-migration-environment-2; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.hzznhhe3/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:traitlets --apt-upgrade jupyter-notebook --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=traitlets/5.14.3-1 -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor autopkgtest --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-2@lcy02-131.secgroup --name adt-oracular-amd64-jupyter-notebook-20240616-102220-juju-7f2275-prod-proposed-migration-environment-2-e4388c01-ca29-4dd9-9535-fafa0350e03b --image adt/ubuntu-oracular-amd64-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-2 --net-id=net_prod-proposed-migration -e TERM=linux -e ''"'"'http_proxy=http://squid.internal:3128'"'"'' -e ''"'"'https_proxy=http://squid.internal:3128'"'"'' -e ''"'"'no_proxy=127.0.0.1,127.0.1.1,login.ubuntu.com,localhost,localdomain,novalocal,internal,archive.ubuntu.com,ports.ubuntu.com,security.ubuntu.com,ddebs.ubuntu.com,changelogs.ubuntu.com,keyserver.ubuntu.com,launchpadlibrarian.net,launchpadcontent.net,launchpad.net,10.24.0.0/24,keystone.ps5.canonical.com,objectstorage.prodstack5.canonical.com'"'"'' --mirror=http://ftpmaster.internal/ubuntu/ 46s autopkgtest [11:52:14]: testbed dpkg architecture: amd64 46s autopkgtest [11:52:14]: testbed apt version: 2.9.3 46s autopkgtest [11:52:14]: @@@@@@@@@@@@@@@@@@@@ test bed setup 46s Get:1 http://ftpmaster.internal/ubuntu oracular-proposed InRelease [110 kB] 46s Get:2 http://ftpmaster.internal/ubuntu oracular-proposed/restricted Sources [7052 B] 46s Get:3 
http://ftpmaster.internal/ubuntu oracular-proposed/main Sources [36.1 kB] 46s Get:4 http://ftpmaster.internal/ubuntu oracular-proposed/multiverse Sources [2576 B] 46s Get:5 http://ftpmaster.internal/ubuntu oracular-proposed/universe Sources [389 kB] 46s Get:6 http://ftpmaster.internal/ubuntu oracular-proposed/main amd64 Packages [53.8 kB] 46s Get:7 http://ftpmaster.internal/ubuntu oracular-proposed/main i386 Packages [38.2 kB] 46s Get:8 http://ftpmaster.internal/ubuntu oracular-proposed/restricted i386 Packages [6732 B] 46s Get:9 http://ftpmaster.internal/ubuntu oracular-proposed/restricted amd64 Packages [28.9 kB] 46s Get:10 http://ftpmaster.internal/ubuntu oracular-proposed/universe amd64 Packages [316 kB] 46s Get:11 http://ftpmaster.internal/ubuntu oracular-proposed/universe i386 Packages [137 kB] 46s Get:12 http://ftpmaster.internal/ubuntu oracular-proposed/multiverse i386 Packages [3884 B] 46s Get:13 http://ftpmaster.internal/ubuntu oracular-proposed/multiverse amd64 Packages [8364 B] 46s Fetched 1138 kB in 0s (5347 kB/s) 46s Reading package lists... 47s Reading package lists... 48s Building dependency tree... 48s Reading state information... 48s Calculating upgrade... 48s The following packages will be upgraded: 48s apt apt-utils libapt-pkg6.0t64 libldap-common libldap2 48s 5 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 48s Need to get 2875 kB of archives. 48s After this operation, 11.3 kB of additional disk space will be used. 
48s Get:1 http://ftpmaster.internal/ubuntu oracular/main amd64 libapt-pkg6.0t64 amd64 2.9.5 [1023 kB] 48s Get:2 http://ftpmaster.internal/ubuntu oracular/main amd64 apt amd64 2.9.5 [1403 kB] 48s Get:3 http://ftpmaster.internal/ubuntu oracular/main amd64 apt-utils amd64 2.9.5 [223 kB] 48s Get:4 http://ftpmaster.internal/ubuntu oracular/main amd64 libldap-common all 2.6.7+dfsg-1~exp1ubuntu9 [31.5 kB] 48s Get:5 http://ftpmaster.internal/ubuntu oracular/main amd64 libldap2 amd64 2.6.7+dfsg-1~exp1ubuntu9 [195 kB] 48s Fetched 2875 kB in 0s (55.6 MB/s) 49s (Reading database ... 74430 files and directories currently installed.) 49s Preparing to unpack .../libapt-pkg6.0t64_2.9.5_amd64.deb ... 49s Unpacking libapt-pkg6.0t64:amd64 (2.9.5) over (2.9.3) ... 49s Setting up libapt-pkg6.0t64:amd64 (2.9.5) ... 49s (Reading database ... 74430 files and directories currently installed.)
49s Preparing to unpack .../archives/apt_2.9.5_amd64.deb ... 49s Unpacking apt (2.9.5) over (2.9.3) ... 49s Setting up apt (2.9.5) ... 50s (Reading database ... 74430 files and directories currently installed.) 50s Preparing to unpack .../apt-utils_2.9.5_amd64.deb ... 50s Unpacking apt-utils (2.9.5) over (2.9.3) ... 50s Preparing to unpack .../libldap-common_2.6.7+dfsg-1~exp1ubuntu9_all.deb ... 50s Unpacking libldap-common (2.6.7+dfsg-1~exp1ubuntu9) over (2.6.7+dfsg-1~exp1ubuntu8) ... 50s Preparing to unpack .../libldap2_2.6.7+dfsg-1~exp1ubuntu9_amd64.deb ... 50s Unpacking libldap2:amd64 (2.6.7+dfsg-1~exp1ubuntu9) over (2.6.7+dfsg-1~exp1ubuntu8) ... 50s Setting up apt-utils (2.9.5) ... 50s Setting up libldap-common (2.6.7+dfsg-1~exp1ubuntu9) ... 50s Setting up libldap2:amd64 (2.6.7+dfsg-1~exp1ubuntu9) ... 50s Processing triggers for man-db (2.12.1-2) ... 51s Processing triggers for libc-bin (2.39-0ubuntu9) ... 52s Reading package lists... 53s Building dependency tree... 53s Reading state information... 53s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 53s Hit:1 http://ftpmaster.internal/ubuntu oracular-proposed InRelease 53s Hit:2 http://ftpmaster.internal/ubuntu oracular InRelease 53s Hit:3 http://ftpmaster.internal/ubuntu oracular-updates InRelease 53s Hit:4 http://ftpmaster.internal/ubuntu oracular-security InRelease 54s Reading package lists... 54s Reading package lists... 54s Building dependency tree...
54s Reading state information... 54s Calculating upgrade... 55s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 55s Reading package lists... 55s Building dependency tree... 55s Reading state information... 55s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 55s autopkgtest [11:52:23]: rebooting testbed after setup commands that affected boot 58s autopkgtest-virt-ssh: WARNING: ssh connection failed. Retrying in 3 seconds... 69s autopkgtest-virt-ssh: WARNING: ssh connection failed. Retrying in 3 seconds... 73s autopkgtest [11:52:41]: testbed running kernel: Linux 6.8.0-31-generic #31-Ubuntu SMP PREEMPT_DYNAMIC Sat Apr 20 00:40:06 UTC 2024 73s autopkgtest [11:52:41]: @@@@@@@@@@@@@@@@@@@@ apt-source jupyter-notebook 74s Get:1 http://ftpmaster.internal/ubuntu oracular/universe jupyter-notebook 6.4.12-2.2ubuntu1 (dsc) [3886 B] 74s Get:2 http://ftpmaster.internal/ubuntu oracular/universe jupyter-notebook 6.4.12-2.2ubuntu1 (tar) [8501 kB] 74s Get:3 http://ftpmaster.internal/ubuntu oracular/universe jupyter-notebook 6.4.12-2.2ubuntu1 (diff) [49.6 kB] 75s gpgv: Signature made Thu Feb 15 18:11:52 2024 UTC 75s gpgv: using RSA key D09F8A854F1055BCFC482C4B23566B906047AFC8 75s gpgv: Can't check signature: No public key 75s dpkg-source: warning: cannot verify inline signature for ./jupyter-notebook_6.4.12-2.2ubuntu1.dsc: no acceptable signature found 75s autopkgtest [11:52:43]: testing package jupyter-notebook version 6.4.12-2.2ubuntu1 75s autopkgtest [11:52:43]: build not needed 75s autopkgtest [11:52:43]: test pytest: preparing testbed 76s Reading package lists... 76s Building dependency tree... 76s Reading state information... 
76s Starting pkgProblemResolver with broken count: 0 76s Starting 2 pkgProblemResolver with broken count: 0 76s Done 77s The following additional packages will be installed: 77s fonts-font-awesome fonts-glyphicons-halflings fonts-lato fonts-mathjax gdb 77s jupyter-core jupyter-notebook libbabeltrace1 libdebuginfod-common 77s libdebuginfod1t64 libipt2 libjs-backbone libjs-bootstrap 77s libjs-bootstrap-tour libjs-codemirror libjs-es6-promise libjs-jed 77s libjs-jquery libjs-jquery-typeahead libjs-jquery-ui libjs-marked 77s libjs-mathjax libjs-moment libjs-requirejs libjs-requirejs-text 77s libjs-sphinxdoc libjs-text-encoding libjs-underscore libjs-xterm libnorm1t64 77s libpgm-5.3-0t64 libpython3.12t64 libsodium23 libsource-highlight-common 77s libsource-highlight4t64 libzmq5 node-jed python-notebook-doc 77s python-tinycss2-common python3-argon2 python3-asttokens python3-bleach 77s python3-bs4 python3-bytecode python3-comm python3-coverage python3-dateutil 77s python3-debugpy python3-decorator python3-defusedxml python3-entrypoints 77s python3-executing python3-fastjsonschema python3-html5lib python3-iniconfig 77s python3-ipykernel python3-ipython python3-ipython-genutils python3-jedi 77s python3-jupyter-client python3-jupyter-core python3-jupyterlab-pygments 77s python3-matplotlib-inline python3-mistune python3-nbclient python3-nbconvert 77s python3-nbformat python3-nest-asyncio python3-notebook python3-packaging 77s python3-pandocfilters python3-parso python3-pexpect python3-platformdirs 77s python3-pluggy python3-prometheus-client python3-prompt-toolkit 77s python3-psutil python3-ptyprocess python3-pure-eval python3-py 77s python3-pydevd python3-pytest python3-requests-unixsocket python3-send2trash 77s python3-soupsieve python3-stack-data python3-terminado python3-tinycss2 77s python3-tornado python3-traitlets python3-typeshed python3-wcwidth 77s python3-webencodings python3-zmq sphinx-rtd-theme-common 77s Suggested packages: 77s gdb-doc gdbserver 
libjs-jquery-lazyload libjs-json libjs-jquery-ui-docs 77s fonts-mathjax-extras fonts-stix libjs-mathjax-doc python-argon2-doc 77s python-bleach-doc python-bytecode-doc python-coverage-doc 77s python-fastjsonschema-doc python3-genshi python3-lxml python-ipython-doc 77s python3-pip python-nbconvert-doc texlive-fonts-recommended 77s texlive-plain-generic texlive-xetex python-pexpect-doc subversion pydevd 77s python-terminado-doc python-tinycss2-doc python3-pycurl python-tornado-doc 77s python3-twisted 77s Recommended packages: 77s libc-dbg javascript-common python3-lxml python3-matplotlib pandoc 77s python3-ipywidgets 77s The following NEW packages will be installed: 77s autopkgtest-satdep fonts-font-awesome fonts-glyphicons-halflings fonts-lato 77s fonts-mathjax gdb jupyter-core jupyter-notebook libbabeltrace1 77s libdebuginfod-common libdebuginfod1t64 libipt2 libjs-backbone 77s libjs-bootstrap libjs-bootstrap-tour libjs-codemirror libjs-es6-promise 77s libjs-jed libjs-jquery libjs-jquery-typeahead libjs-jquery-ui libjs-marked 77s libjs-mathjax libjs-moment libjs-requirejs libjs-requirejs-text 77s libjs-sphinxdoc libjs-text-encoding libjs-underscore libjs-xterm libnorm1t64 77s libpgm-5.3-0t64 libpython3.12t64 libsodium23 libsource-highlight-common 77s libsource-highlight4t64 libzmq5 node-jed python-notebook-doc 77s python-tinycss2-common python3-argon2 python3-asttokens python3-bleach 77s python3-bs4 python3-bytecode python3-comm python3-coverage python3-dateutil 77s python3-debugpy python3-decorator python3-defusedxml python3-entrypoints 77s python3-executing python3-fastjsonschema python3-html5lib python3-iniconfig 77s python3-ipykernel python3-ipython python3-ipython-genutils python3-jedi 77s python3-jupyter-client python3-jupyter-core python3-jupyterlab-pygments 77s python3-matplotlib-inline python3-mistune python3-nbclient python3-nbconvert 77s python3-nbformat python3-nest-asyncio python3-notebook python3-packaging 77s python3-pandocfilters python3-parso 
python3-pexpect python3-platformdirs 77s python3-pluggy python3-prometheus-client python3-prompt-toolkit 77s python3-psutil python3-ptyprocess python3-pure-eval python3-py 77s python3-pydevd python3-pytest python3-requests-unixsocket python3-send2trash 77s python3-soupsieve python3-stack-data python3-terminado python3-tinycss2 77s python3-tornado python3-traitlets python3-typeshed python3-wcwidth 77s python3-webencodings python3-zmq sphinx-rtd-theme-common 77s 0 upgraded, 97 newly installed, 0 to remove and 0 not upgraded. 77s Need to get 33.5 MB/33.5 MB of archives. 77s After this operation, 170 MB of additional disk space will be used. 77s Get:1 /tmp/autopkgtest.iPdHX2/1-autopkgtest-satdep.deb autopkgtest-satdep amd64 0 [748 B] 77s Get:2 http://ftpmaster.internal/ubuntu oracular/main amd64 fonts-lato all 2.015-1 [2781 kB] 77s Get:3 http://ftpmaster.internal/ubuntu oracular/main amd64 libdebuginfod-common all 0.191-1 [14.6 kB] 77s Get:4 http://ftpmaster.internal/ubuntu oracular/main amd64 fonts-font-awesome all 5.0.10+really4.7.0~dfsg-4.1 [516 kB] 77s Get:5 http://ftpmaster.internal/ubuntu oracular/universe amd64 fonts-glyphicons-halflings all 1.009~3.4.1+dfsg-3 [118 kB] 77s Get:6 http://ftpmaster.internal/ubuntu oracular/main amd64 fonts-mathjax all 2.7.9+dfsg-1 [2208 kB] 77s Get:7 http://ftpmaster.internal/ubuntu oracular/main amd64 libbabeltrace1 amd64 1.5.11-3build3 [164 kB] 77s Get:8 http://ftpmaster.internal/ubuntu oracular/main amd64 libdebuginfod1t64 amd64 0.191-1 [17.1 kB] 77s Get:9 http://ftpmaster.internal/ubuntu oracular/main amd64 libipt2 amd64 2.0.6-1build1 [45.7 kB] 77s Get:10 http://ftpmaster.internal/ubuntu oracular/main amd64 libpython3.12t64 amd64 3.12.4-1 [2338 kB] 77s Get:11 http://ftpmaster.internal/ubuntu oracular/main amd64 libsource-highlight-common all 3.1.9-4.3build1 [64.2 kB] 77s Get:12 http://ftpmaster.internal/ubuntu oracular/main amd64 libsource-highlight4t64 amd64 3.1.9-4.3build1 [258 kB] 77s Get:13 http://ftpmaster.internal/ubuntu 
oracular/main amd64 gdb amd64 15.0.50.20240403-0ubuntu1 [4010 kB] 77s Get:14 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-platformdirs all 4.2.1-1 [16.3 kB] 77s Get:15 http://ftpmaster.internal/ubuntu oracular-proposed/universe amd64 python3-traitlets all 5.14.3-1 [71.3 kB] 77s Get:16 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-jupyter-core all 5.3.2-2 [25.5 kB] 77s Get:17 http://ftpmaster.internal/ubuntu oracular/universe amd64 jupyter-core all 5.3.2-2 [4038 B] 77s Get:18 http://ftpmaster.internal/ubuntu oracular/main amd64 libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [118 kB] 77s Get:19 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-backbone all 1.4.1~dfsg+~1.4.15-3 [185 kB] 77s Get:20 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-bootstrap all 3.4.1+dfsg-3 [129 kB] 77s Get:21 http://ftpmaster.internal/ubuntu oracular/main amd64 libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB] 77s Get:22 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-bootstrap-tour all 0.12.0+dfsg-5 [21.4 kB] 77s Get:23 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-codemirror all 5.65.0+~cs5.83.9-3 [755 kB] 77s Get:24 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-es6-promise all 4.2.8-12 [14.1 kB] 77s Get:25 http://ftpmaster.internal/ubuntu oracular/universe amd64 node-jed all 1.1.1-4 [15.2 kB] 77s Get:26 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-jed all 1.1.1-4 [2584 B] 77s Get:27 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-jquery-typeahead all 2.11.0+dfsg1-3 [48.9 kB] 77s Get:28 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-jquery-ui all 1.13.2+dfsg-1 [252 kB] 77s Get:29 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-marked all 4.2.3+ds+~4.0.7-3 [36.2 kB] 77s Get:30 http://ftpmaster.internal/ubuntu oracular/main amd64 libjs-mathjax all 2.7.9+dfsg-1 [5665 kB] 77s Get:31 http://ftpmaster.internal/ubuntu 
oracular/universe amd64 libjs-moment all 2.29.4+ds-1 [147 kB] 77s Get:32 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-requirejs all 2.3.6+ds+~2.1.37-1 [201 kB] 77s Get:33 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-requirejs-text all 2.0.12-1.1 [9056 B] 77s Get:34 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-text-encoding all 0.7.0-5 [140 kB] 77s Get:35 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-xterm all 5.3.0-2 [476 kB] 77s Get:36 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-ptyprocess all 0.7.0-5 [15.1 kB] 77s Get:37 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-tornado amd64 6.4.1-1 [298 kB] 77s Get:38 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-terminado all 0.18.1-1 [13.2 kB] 77s Get:39 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-argon2 amd64 21.1.0-2build1 [21.0 kB] 77s Get:40 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-comm all 0.2.1-1 [7016 B] 77s Get:41 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-bytecode all 0.15.1-3 [44.7 kB] 77s Get:42 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-coverage amd64 7.4.4+dfsg1-0ubuntu2 [147 kB] 77s Get:43 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-pydevd amd64 2.10.0+ds-10ubuntu1 [637 kB] 77s Get:44 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-debugpy all 1.8.0+ds-4ubuntu4 [67.6 kB] 77s Get:45 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-decorator all 5.1.1-5 [10.1 kB] 77s Get:46 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-parso all 0.8.3-1 [67.2 kB] 77s Get:47 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-typeshed all 0.0~git20231111.6764465-3 [1274 kB] 77s Get:48 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-jedi all 0.19.1+ds1-1 [693 kB] 77s Get:49 http://ftpmaster.internal/ubuntu oracular/universe 
amd64 python3-matplotlib-inline all 0.1.6-2 [8784 B] 77s Get:50 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-pexpect all 4.9-2 [48.1 kB] 77s Get:51 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-wcwidth all 0.2.5+dfsg1-1.1ubuntu1 [22.5 kB] 77s Get:52 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-prompt-toolkit all 3.0.46-1 [256 kB] 77s Get:53 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-asttokens all 2.4.1-1 [20.9 kB] 77s Get:54 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-executing all 2.0.1-0.1 [23.3 kB] 77s Get:55 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-pure-eval all 0.2.2-2 [11.1 kB] 77s Get:56 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-stack-data all 0.6.3-1 [22.0 kB] 77s Get:57 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-ipython all 8.20.0-1ubuntu1 [561 kB] 77s Get:58 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-dateutil all 2.9.0-2 [80.3 kB] 77s Get:59 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-entrypoints all 0.4-2 [7146 B] 77s Get:60 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-nest-asyncio all 1.5.4-1 [6256 B] 77s Get:61 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-py all 1.11.0-2 [72.7 kB] 77s Get:62 http://ftpmaster.internal/ubuntu oracular/universe amd64 libnorm1t64 amd64 1.5.9+dfsg-3.1build1 [154 kB] 77s Get:63 http://ftpmaster.internal/ubuntu oracular/universe amd64 libpgm-5.3-0t64 amd64 5.3.128~dfsg-2.1build1 [167 kB] 77s Get:64 http://ftpmaster.internal/ubuntu oracular/main amd64 libsodium23 amd64 1.0.18-1build3 [161 kB] 77s Get:65 http://ftpmaster.internal/ubuntu oracular/universe amd64 libzmq5 amd64 4.3.5-1build2 [260 kB] 77s Get:66 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-zmq amd64 24.0.1-5build1 [286 kB] 77s Get:67 http://ftpmaster.internal/ubuntu oracular/universe amd64 
python3-jupyter-client all 7.4.9-2ubuntu1 [90.5 kB] 77s Get:68 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-packaging all 24.0-1 [41.1 kB] 77s Get:69 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-psutil amd64 5.9.8-2build2 [195 kB] 77s Get:70 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-ipykernel all 6.29.3-1ubuntu1 [82.6 kB] 77s Get:71 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-ipython-genutils all 0.2.0-6 [22.0 kB] 77s Get:72 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-webencodings all 0.5.1-5 [11.5 kB] 77s Get:73 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-html5lib all 1.1-6 [88.8 kB] 77s Get:74 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-bleach all 6.1.0-2 [49.6 kB] 77s Get:75 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-soupsieve all 2.5-1 [33.0 kB] 77s Get:76 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-bs4 all 4.12.3-1 [109 kB] 77s Get:77 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-defusedxml all 0.7.1-2 [42.0 kB] 77s Get:78 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-jupyterlab-pygments all 0.2.2-3 [6054 B] 77s Get:79 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-mistune all 3.0.2-1 [32.8 kB] 77s Get:80 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-fastjsonschema all 2.19.1-1 [19.7 kB] 77s Get:81 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-nbformat all 5.9.1-1 [41.2 kB] 77s Get:82 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-nbclient all 0.8.0-1 [55.6 kB] 77s Get:83 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-pandocfilters all 1.5.1-1 [23.6 kB] 77s Get:84 http://ftpmaster.internal/ubuntu oracular/universe amd64 python-tinycss2-common all 1.3.0-1 [34.1 kB] 77s Get:85 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-tinycss2 all 1.3.0-1 [19.6 kB] 77s 
Get:86 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-nbconvert all 7.16.4-1 [156 kB] 77s Get:87 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-prometheus-client all 0.19.0+ds1-1 [41.7 kB] 77s Get:88 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-send2trash all 1.8.2-1 [15.5 kB] 77s Get:89 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-notebook all 6.4.12-2.2ubuntu1 [1566 kB] 77s Get:90 http://ftpmaster.internal/ubuntu oracular/universe amd64 jupyter-notebook all 6.4.12-2.2ubuntu1 [10.4 kB] 77s Get:91 http://ftpmaster.internal/ubuntu oracular/main amd64 libjs-sphinxdoc all 7.2.6-8 [150 kB] 77s Get:92 http://ftpmaster.internal/ubuntu oracular/main amd64 sphinx-rtd-theme-common all 2.0.0+dfsg-1 [1012 kB] 77s Get:93 http://ftpmaster.internal/ubuntu oracular/universe amd64 python-notebook-doc all 6.4.12-2.2ubuntu1 [2540 kB] 77s Get:94 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-iniconfig all 1.1.1-2 [6024 B] 77s Get:95 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-pluggy all 1.5.0-1 [21.0 kB] 77s Get:96 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-pytest all 7.4.4-1 [305 kB] 77s Get:97 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-requests-unixsocket all 0.3.0-4 [7274 B] 78s Preconfiguring packages ... 78s Fetched 33.5 MB in 0s (79.8 MB/s) 78s Selecting previously unselected package fonts-lato. 78s (Reading database ...
74430 files and directories currently installed.) 78s Preparing to unpack .../00-fonts-lato_2.015-1_all.deb ... 78s Unpacking fonts-lato (2.015-1) ... 78s Selecting previously unselected package libdebuginfod-common. 78s Preparing to unpack .../01-libdebuginfod-common_0.191-1_all.deb ... 78s Unpacking libdebuginfod-common (0.191-1) ... 78s Selecting previously unselected package fonts-font-awesome. 78s Preparing to unpack .../02-fonts-font-awesome_5.0.10+really4.7.0~dfsg-4.1_all.deb ... 78s Unpacking fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 78s Selecting previously unselected package fonts-glyphicons-halflings. 78s Preparing to unpack .../03-fonts-glyphicons-halflings_1.009~3.4.1+dfsg-3_all.deb ... 78s Unpacking fonts-glyphicons-halflings (1.009~3.4.1+dfsg-3) ... 78s Selecting previously unselected package fonts-mathjax. 78s Preparing to unpack .../04-fonts-mathjax_2.7.9+dfsg-1_all.deb ... 78s Unpacking fonts-mathjax (2.7.9+dfsg-1) ... 79s Selecting previously unselected package libbabeltrace1:amd64. 79s Preparing to unpack .../05-libbabeltrace1_1.5.11-3build3_amd64.deb ... 79s Unpacking libbabeltrace1:amd64 (1.5.11-3build3) ... 79s Selecting previously unselected package libdebuginfod1t64:amd64. 79s Preparing to unpack .../06-libdebuginfod1t64_0.191-1_amd64.deb ... 79s Unpacking libdebuginfod1t64:amd64 (0.191-1) ... 79s Selecting previously unselected package libipt2. 79s Preparing to unpack .../07-libipt2_2.0.6-1build1_amd64.deb ... 79s Unpacking libipt2 (2.0.6-1build1) ... 79s Selecting previously unselected package libpython3.12t64:amd64. 79s Preparing to unpack .../08-libpython3.12t64_3.12.4-1_amd64.deb ... 79s Unpacking libpython3.12t64:amd64 (3.12.4-1) ... 79s Selecting previously unselected package libsource-highlight-common. 79s Preparing to unpack .../09-libsource-highlight-common_3.1.9-4.3build1_all.deb ... 79s Unpacking libsource-highlight-common (3.1.9-4.3build1) ...
79s Selecting previously unselected package libsource-highlight4t64:amd64. 79s Preparing to unpack .../10-libsource-highlight4t64_3.1.9-4.3build1_amd64.deb ... 79s Unpacking libsource-highlight4t64:amd64 (3.1.9-4.3build1) ... 79s Selecting previously unselected package gdb. 79s Preparing to unpack .../11-gdb_15.0.50.20240403-0ubuntu1_amd64.deb ... 79s Unpacking gdb (15.0.50.20240403-0ubuntu1) ... 79s Selecting previously unselected package python3-platformdirs. 79s Preparing to unpack .../12-python3-platformdirs_4.2.1-1_all.deb ... 79s Unpacking python3-platformdirs (4.2.1-1) ... 79s Selecting previously unselected package python3-traitlets. 79s Preparing to unpack .../13-python3-traitlets_5.14.3-1_all.deb ... 79s Unpacking python3-traitlets (5.14.3-1) ... 79s Selecting previously unselected package python3-jupyter-core. 79s Preparing to unpack .../14-python3-jupyter-core_5.3.2-2_all.deb ... 79s Unpacking python3-jupyter-core (5.3.2-2) ... 79s Selecting previously unselected package jupyter-core. 79s Preparing to unpack .../15-jupyter-core_5.3.2-2_all.deb ... 79s Unpacking jupyter-core (5.3.2-2) ... 79s Selecting previously unselected package libjs-underscore. 79s Preparing to unpack .../16-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ... 79s Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 79s Selecting previously unselected package libjs-backbone. 79s Preparing to unpack .../17-libjs-backbone_1.4.1~dfsg+~1.4.15-3_all.deb ... 79s Unpacking libjs-backbone (1.4.1~dfsg+~1.4.15-3) ... 79s Selecting previously unselected package libjs-bootstrap. 79s Preparing to unpack .../18-libjs-bootstrap_3.4.1+dfsg-3_all.deb ... 79s Unpacking libjs-bootstrap (3.4.1+dfsg-3) ... 79s Selecting previously unselected package libjs-jquery. 79s Preparing to unpack .../19-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ... 79s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 79s Selecting previously unselected package libjs-bootstrap-tour. 
79s Preparing to unpack .../20-libjs-bootstrap-tour_0.12.0+dfsg-5_all.deb ... 79s Unpacking libjs-bootstrap-tour (0.12.0+dfsg-5) ... 79s Selecting previously unselected package libjs-codemirror. 79s Preparing to unpack .../21-libjs-codemirror_5.65.0+~cs5.83.9-3_all.deb ... 79s Unpacking libjs-codemirror (5.65.0+~cs5.83.9-3) ... 79s Selecting previously unselected package libjs-es6-promise. 79s Preparing to unpack .../22-libjs-es6-promise_4.2.8-12_all.deb ... 79s Unpacking libjs-es6-promise (4.2.8-12) ... 79s Selecting previously unselected package node-jed. 79s Preparing to unpack .../23-node-jed_1.1.1-4_all.deb ... 79s Unpacking node-jed (1.1.1-4) ... 79s Selecting previously unselected package libjs-jed. 79s Preparing to unpack .../24-libjs-jed_1.1.1-4_all.deb ... 79s Unpacking libjs-jed (1.1.1-4) ... 79s Selecting previously unselected package libjs-jquery-typeahead. 79s Preparing to unpack .../25-libjs-jquery-typeahead_2.11.0+dfsg1-3_all.deb ... 79s Unpacking libjs-jquery-typeahead (2.11.0+dfsg1-3) ... 79s Selecting previously unselected package libjs-jquery-ui. 79s Preparing to unpack .../26-libjs-jquery-ui_1.13.2+dfsg-1_all.deb ... 79s Unpacking libjs-jquery-ui (1.13.2+dfsg-1) ... 79s Selecting previously unselected package libjs-marked. 79s Preparing to unpack .../27-libjs-marked_4.2.3+ds+~4.0.7-3_all.deb ... 79s Unpacking libjs-marked (4.2.3+ds+~4.0.7-3) ... 79s Selecting previously unselected package libjs-mathjax. 80s Preparing to unpack .../28-libjs-mathjax_2.7.9+dfsg-1_all.deb ... 80s Unpacking libjs-mathjax (2.7.9+dfsg-1) ... 80s Selecting previously unselected package libjs-moment. 80s Preparing to unpack .../29-libjs-moment_2.29.4+ds-1_all.deb ... 80s Unpacking libjs-moment (2.29.4+ds-1) ... 80s Selecting previously unselected package libjs-requirejs. 80s Preparing to unpack .../30-libjs-requirejs_2.3.6+ds+~2.1.37-1_all.deb ... 80s Unpacking libjs-requirejs (2.3.6+ds+~2.1.37-1) ... 80s Selecting previously unselected package libjs-requirejs-text. 
80s Preparing to unpack .../31-libjs-requirejs-text_2.0.12-1.1_all.deb ... 80s Unpacking libjs-requirejs-text (2.0.12-1.1) ... 80s Selecting previously unselected package libjs-text-encoding. 80s Preparing to unpack .../32-libjs-text-encoding_0.7.0-5_all.deb ... 80s Unpacking libjs-text-encoding (0.7.0-5) ... 80s Selecting previously unselected package libjs-xterm. 80s Preparing to unpack .../33-libjs-xterm_5.3.0-2_all.deb ... 80s Unpacking libjs-xterm (5.3.0-2) ... 80s Selecting previously unselected package python3-ptyprocess. 80s Preparing to unpack .../34-python3-ptyprocess_0.7.0-5_all.deb ... 80s Unpacking python3-ptyprocess (0.7.0-5) ... 80s Selecting previously unselected package python3-tornado. 80s Preparing to unpack .../35-python3-tornado_6.4.1-1_amd64.deb ... 80s Unpacking python3-tornado (6.4.1-1) ... 80s Selecting previously unselected package python3-terminado. 80s Preparing to unpack .../36-python3-terminado_0.18.1-1_all.deb ... 80s Unpacking python3-terminado (0.18.1-1) ... 80s Selecting previously unselected package python3-argon2. 81s Preparing to unpack .../37-python3-argon2_21.1.0-2build1_amd64.deb ... 81s Unpacking python3-argon2 (21.1.0-2build1) ... 81s Selecting previously unselected package python3-comm. 81s Preparing to unpack .../38-python3-comm_0.2.1-1_all.deb ... 81s Unpacking python3-comm (0.2.1-1) ... 81s Selecting previously unselected package python3-bytecode. 81s Preparing to unpack .../39-python3-bytecode_0.15.1-3_all.deb ... 81s Unpacking python3-bytecode (0.15.1-3) ... 81s Selecting previously unselected package python3-coverage. 81s Preparing to unpack .../40-python3-coverage_7.4.4+dfsg1-0ubuntu2_amd64.deb ... 81s Unpacking python3-coverage (7.4.4+dfsg1-0ubuntu2) ... 81s Selecting previously unselected package python3-pydevd. 81s Preparing to unpack .../41-python3-pydevd_2.10.0+ds-10ubuntu1_amd64.deb ... 81s Unpacking python3-pydevd (2.10.0+ds-10ubuntu1) ... 81s Selecting previously unselected package python3-debugpy. 
81s Preparing to unpack .../42-python3-debugpy_1.8.0+ds-4ubuntu4_all.deb ...
81s Unpacking python3-debugpy (1.8.0+ds-4ubuntu4) ...
81s Selecting previously unselected package python3-decorator.
81s Preparing to unpack .../43-python3-decorator_5.1.1-5_all.deb ...
81s Unpacking python3-decorator (5.1.1-5) ...
81s Selecting previously unselected package python3-parso.
81s Preparing to unpack .../44-python3-parso_0.8.3-1_all.deb ...
81s Unpacking python3-parso (0.8.3-1) ...
81s Selecting previously unselected package python3-typeshed.
81s Preparing to unpack .../45-python3-typeshed_0.0~git20231111.6764465-3_all.deb ...
81s Unpacking python3-typeshed (0.0~git20231111.6764465-3) ...
81s Selecting previously unselected package python3-jedi.
81s Preparing to unpack .../46-python3-jedi_0.19.1+ds1-1_all.deb ...
81s Unpacking python3-jedi (0.19.1+ds1-1) ...
82s Selecting previously unselected package python3-matplotlib-inline.
82s Preparing to unpack .../47-python3-matplotlib-inline_0.1.6-2_all.deb ...
82s Unpacking python3-matplotlib-inline (0.1.6-2) ...
82s Selecting previously unselected package python3-pexpect.
82s Preparing to unpack .../48-python3-pexpect_4.9-2_all.deb ...
82s Unpacking python3-pexpect (4.9-2) ...
82s Selecting previously unselected package python3-wcwidth.
82s Preparing to unpack .../49-python3-wcwidth_0.2.5+dfsg1-1.1ubuntu1_all.deb ...
82s Unpacking python3-wcwidth (0.2.5+dfsg1-1.1ubuntu1) ...
82s Selecting previously unselected package python3-prompt-toolkit.
82s Preparing to unpack .../50-python3-prompt-toolkit_3.0.46-1_all.deb ...
82s Unpacking python3-prompt-toolkit (3.0.46-1) ...
82s Selecting previously unselected package python3-asttokens.
82s Preparing to unpack .../51-python3-asttokens_2.4.1-1_all.deb ...
82s Unpacking python3-asttokens (2.4.1-1) ...
82s Selecting previously unselected package python3-executing.
82s Preparing to unpack .../52-python3-executing_2.0.1-0.1_all.deb ...
82s Unpacking python3-executing (2.0.1-0.1) ...
82s Selecting previously unselected package python3-pure-eval.
82s Preparing to unpack .../53-python3-pure-eval_0.2.2-2_all.deb ...
82s Unpacking python3-pure-eval (0.2.2-2) ...
82s Selecting previously unselected package python3-stack-data.
82s Preparing to unpack .../54-python3-stack-data_0.6.3-1_all.deb ...
82s Unpacking python3-stack-data (0.6.3-1) ...
82s Selecting previously unselected package python3-ipython.
82s Preparing to unpack .../55-python3-ipython_8.20.0-1ubuntu1_all.deb ...
82s Unpacking python3-ipython (8.20.0-1ubuntu1) ...
82s Selecting previously unselected package python3-dateutil.
82s Preparing to unpack .../56-python3-dateutil_2.9.0-2_all.deb ...
82s Unpacking python3-dateutil (2.9.0-2) ...
82s Selecting previously unselected package python3-entrypoints.
82s Preparing to unpack .../57-python3-entrypoints_0.4-2_all.deb ...
82s Unpacking python3-entrypoints (0.4-2) ...
82s Selecting previously unselected package python3-nest-asyncio.
82s Preparing to unpack .../58-python3-nest-asyncio_1.5.4-1_all.deb ...
82s Unpacking python3-nest-asyncio (1.5.4-1) ...
82s Selecting previously unselected package python3-py.
82s Preparing to unpack .../59-python3-py_1.11.0-2_all.deb ...
82s Unpacking python3-py (1.11.0-2) ...
82s Selecting previously unselected package libnorm1t64:amd64.
82s Preparing to unpack .../60-libnorm1t64_1.5.9+dfsg-3.1build1_amd64.deb ...
82s Unpacking libnorm1t64:amd64 (1.5.9+dfsg-3.1build1) ...
82s Selecting previously unselected package libpgm-5.3-0t64:amd64.
82s Preparing to unpack .../61-libpgm-5.3-0t64_5.3.128~dfsg-2.1build1_amd64.deb ...
82s Unpacking libpgm-5.3-0t64:amd64 (5.3.128~dfsg-2.1build1) ...
82s Selecting previously unselected package libsodium23:amd64.
82s Preparing to unpack .../62-libsodium23_1.0.18-1build3_amd64.deb ...
82s Unpacking libsodium23:amd64 (1.0.18-1build3) ...
82s Selecting previously unselected package libzmq5:amd64.
82s Preparing to unpack .../63-libzmq5_4.3.5-1build2_amd64.deb ...
82s Unpacking libzmq5:amd64 (4.3.5-1build2) ...
82s Selecting previously unselected package python3-zmq.
82s Preparing to unpack .../64-python3-zmq_24.0.1-5build1_amd64.deb ...
82s Unpacking python3-zmq (24.0.1-5build1) ...
82s Selecting previously unselected package python3-jupyter-client.
82s Preparing to unpack .../65-python3-jupyter-client_7.4.9-2ubuntu1_all.deb ...
82s Unpacking python3-jupyter-client (7.4.9-2ubuntu1) ...
82s Selecting previously unselected package python3-packaging.
82s Preparing to unpack .../66-python3-packaging_24.0-1_all.deb ...
82s Unpacking python3-packaging (24.0-1) ...
82s Selecting previously unselected package python3-psutil.
82s Preparing to unpack .../67-python3-psutil_5.9.8-2build2_amd64.deb ...
82s Unpacking python3-psutil (5.9.8-2build2) ...
82s Selecting previously unselected package python3-ipykernel.
82s Preparing to unpack .../68-python3-ipykernel_6.29.3-1ubuntu1_all.deb ...
82s Unpacking python3-ipykernel (6.29.3-1ubuntu1) ...
82s Selecting previously unselected package python3-ipython-genutils.
82s Preparing to unpack .../69-python3-ipython-genutils_0.2.0-6_all.deb ...
82s Unpacking python3-ipython-genutils (0.2.0-6) ...
82s Selecting previously unselected package python3-webencodings.
82s Preparing to unpack .../70-python3-webencodings_0.5.1-5_all.deb ...
82s Unpacking python3-webencodings (0.5.1-5) ...
82s Selecting previously unselected package python3-html5lib.
82s Preparing to unpack .../71-python3-html5lib_1.1-6_all.deb ...
82s Unpacking python3-html5lib (1.1-6) ...
82s Selecting previously unselected package python3-bleach.
82s Preparing to unpack .../72-python3-bleach_6.1.0-2_all.deb ...
82s Unpacking python3-bleach (6.1.0-2) ...
82s Selecting previously unselected package python3-soupsieve.
82s Preparing to unpack .../73-python3-soupsieve_2.5-1_all.deb ...
82s Unpacking python3-soupsieve (2.5-1) ...
82s Selecting previously unselected package python3-bs4.
83s Preparing to unpack .../74-python3-bs4_4.12.3-1_all.deb ...
83s Unpacking python3-bs4 (4.12.3-1) ...
83s Selecting previously unselected package python3-defusedxml.
83s Preparing to unpack .../75-python3-defusedxml_0.7.1-2_all.deb ...
83s Unpacking python3-defusedxml (0.7.1-2) ...
83s Selecting previously unselected package python3-jupyterlab-pygments.
83s Preparing to unpack .../76-python3-jupyterlab-pygments_0.2.2-3_all.deb ...
83s Unpacking python3-jupyterlab-pygments (0.2.2-3) ...
83s Selecting previously unselected package python3-mistune.
83s Preparing to unpack .../77-python3-mistune_3.0.2-1_all.deb ...
83s Unpacking python3-mistune (3.0.2-1) ...
83s Selecting previously unselected package python3-fastjsonschema.
83s Preparing to unpack .../78-python3-fastjsonschema_2.19.1-1_all.deb ...
83s Unpacking python3-fastjsonschema (2.19.1-1) ...
83s Selecting previously unselected package python3-nbformat.
83s Preparing to unpack .../79-python3-nbformat_5.9.1-1_all.deb ...
83s Unpacking python3-nbformat (5.9.1-1) ...
83s Selecting previously unselected package python3-nbclient.
83s Preparing to unpack .../80-python3-nbclient_0.8.0-1_all.deb ...
83s Unpacking python3-nbclient (0.8.0-1) ...
83s Selecting previously unselected package python3-pandocfilters.
83s Preparing to unpack .../81-python3-pandocfilters_1.5.1-1_all.deb ...
83s Unpacking python3-pandocfilters (1.5.1-1) ...
83s Selecting previously unselected package python-tinycss2-common.
83s Preparing to unpack .../82-python-tinycss2-common_1.3.0-1_all.deb ...
83s Unpacking python-tinycss2-common (1.3.0-1) ...
83s Selecting previously unselected package python3-tinycss2.
83s Preparing to unpack .../83-python3-tinycss2_1.3.0-1_all.deb ...
83s Unpacking python3-tinycss2 (1.3.0-1) ...
83s Selecting previously unselected package python3-nbconvert.
83s Preparing to unpack .../84-python3-nbconvert_7.16.4-1_all.deb ...
83s Unpacking python3-nbconvert (7.16.4-1) ...
83s Selecting previously unselected package python3-prometheus-client.
83s Preparing to unpack .../85-python3-prometheus-client_0.19.0+ds1-1_all.deb ...
83s Unpacking python3-prometheus-client (0.19.0+ds1-1) ...
83s Selecting previously unselected package python3-send2trash.
83s Preparing to unpack .../86-python3-send2trash_1.8.2-1_all.deb ...
83s Unpacking python3-send2trash (1.8.2-1) ...
83s Selecting previously unselected package python3-notebook.
83s Preparing to unpack .../87-python3-notebook_6.4.12-2.2ubuntu1_all.deb ...
83s Unpacking python3-notebook (6.4.12-2.2ubuntu1) ...
83s Selecting previously unselected package jupyter-notebook.
83s Preparing to unpack .../88-jupyter-notebook_6.4.12-2.2ubuntu1_all.deb ...
83s Unpacking jupyter-notebook (6.4.12-2.2ubuntu1) ...
83s Selecting previously unselected package libjs-sphinxdoc.
83s Preparing to unpack .../89-libjs-sphinxdoc_7.2.6-8_all.deb ...
83s Unpacking libjs-sphinxdoc (7.2.6-8) ...
83s Selecting previously unselected package sphinx-rtd-theme-common.
83s Preparing to unpack .../90-sphinx-rtd-theme-common_2.0.0+dfsg-1_all.deb ...
83s Unpacking sphinx-rtd-theme-common (2.0.0+dfsg-1) ...
83s Selecting previously unselected package python-notebook-doc.
83s Preparing to unpack .../91-python-notebook-doc_6.4.12-2.2ubuntu1_all.deb ...
83s Unpacking python-notebook-doc (6.4.12-2.2ubuntu1) ...
83s Selecting previously unselected package python3-iniconfig.
83s Preparing to unpack .../92-python3-iniconfig_1.1.1-2_all.deb ...
83s Unpacking python3-iniconfig (1.1.1-2) ...
83s Selecting previously unselected package python3-pluggy.
83s Preparing to unpack .../93-python3-pluggy_1.5.0-1_all.deb ...
83s Unpacking python3-pluggy (1.5.0-1) ...
83s Selecting previously unselected package python3-pytest.
83s Preparing to unpack .../94-python3-pytest_7.4.4-1_all.deb ...
83s Unpacking python3-pytest (7.4.4-1) ...
83s Selecting previously unselected package python3-requests-unixsocket.
83s Preparing to unpack .../95-python3-requests-unixsocket_0.3.0-4_all.deb ...
83s Unpacking python3-requests-unixsocket (0.3.0-4) ...
83s Selecting previously unselected package autopkgtest-satdep.
83s Preparing to unpack .../96-1-autopkgtest-satdep.deb ...
83s Unpacking autopkgtest-satdep (0) ...
83s Setting up python3-entrypoints (0.4-2) ...
83s Setting up libjs-jquery-typeahead (2.11.0+dfsg1-3) ...
83s Setting up python3-iniconfig (1.1.1-2) ...
84s Setting up python3-tornado (6.4.1-1) ...
84s Setting up libnorm1t64:amd64 (1.5.9+dfsg-3.1build1) ...
84s Setting up python3-pure-eval (0.2.2-2) ...
84s Setting up python3-send2trash (1.8.2-1) ...
84s Setting up fonts-lato (2.015-1) ...
84s Setting up fonts-mathjax (2.7.9+dfsg-1) ...
84s Setting up libsodium23:amd64 (1.0.18-1build3) ...
84s Setting up libjs-mathjax (2.7.9+dfsg-1) ...
84s Setting up python3-py (1.11.0-2) ...
84s Setting up libdebuginfod-common (0.191-1) ...
84s Setting up libjs-requirejs-text (2.0.12-1.1) ...
84s Setting up python3-parso (0.8.3-1) ...
85s Setting up python3-defusedxml (0.7.1-2) ...
85s Setting up python3-ipython-genutils (0.2.0-6) ...
85s Setting up python3-asttokens (2.4.1-1) ...
85s Setting up fonts-glyphicons-halflings (1.009~3.4.1+dfsg-3) ...
85s Setting up python3-coverage (7.4.4+dfsg1-0ubuntu2) ...
85s Setting up libjs-moment (2.29.4+ds-1) ...
85s Setting up python3-pandocfilters (1.5.1-1) ...
85s Setting up libjs-requirejs (2.3.6+ds+~2.1.37-1) ...
85s Setting up libjs-es6-promise (4.2.8-12) ...
85s Setting up libjs-text-encoding (0.7.0-5) ...
85s Setting up python3-webencodings (0.5.1-5) ...
85s Setting up python3-platformdirs (4.2.1-1) ...
85s Setting up python3-psutil (5.9.8-2build2) ...
86s Setting up libsource-highlight-common (3.1.9-4.3build1) ...
86s Setting up python3-requests-unixsocket (0.3.0-4) ...
86s Setting up python3-jupyterlab-pygments (0.2.2-3) ...
86s Setting up libpython3.12t64:amd64 (3.12.4-1) ...
86s Setting up libpgm-5.3-0t64:amd64 (5.3.128~dfsg-2.1build1) ...
86s Setting up python3-decorator (5.1.1-5) ...
86s Setting up python3-packaging (24.0-1) ...
86s Setting up python3-wcwidth (0.2.5+dfsg1-1.1ubuntu1) ...
86s Setting up node-jed (1.1.1-4) ...
86s Setting up python3-typeshed (0.0~git20231111.6764465-3) ...
86s Setting up python3-executing (2.0.1-0.1) ...
86s Setting up libjs-xterm (5.3.0-2) ...
86s Setting up python3-nest-asyncio (1.5.4-1) ...
86s Setting up python3-bytecode (0.15.1-3) ...
87s Setting up libjs-codemirror (5.65.0+~cs5.83.9-3) ...
87s Setting up libjs-jed (1.1.1-4) ...
87s Setting up libipt2 (2.0.6-1build1) ...
87s Setting up python3-html5lib (1.1-6) ...
87s Setting up libbabeltrace1:amd64 (1.5.11-3build3) ...
87s Setting up python3-pluggy (1.5.0-1) ...
87s Setting up python3-fastjsonschema (2.19.1-1) ...
87s Setting up python3-traitlets (5.14.3-1) ...
87s Setting up python-tinycss2-common (1.3.0-1) ...
87s Setting up python3-argon2 (21.1.0-2build1) ...
87s Setting up python3-dateutil (2.9.0-2) ...
87s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ...
87s Setting up python3-mistune (3.0.2-1) ...
87s Setting up python3-stack-data (0.6.3-1) ...
88s Setting up python3-soupsieve (2.5-1) ...
88s Setting up fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ...
88s Setting up sphinx-rtd-theme-common (2.0.0+dfsg-1) ...
88s Setting up python3-jupyter-core (5.3.2-2) ...
88s Setting up libjs-bootstrap (3.4.1+dfsg-3) ...
88s Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ...
88s Setting up python3-ptyprocess (0.7.0-5) ...
88s Setting up libjs-marked (4.2.3+ds+~4.0.7-3) ...
88s Setting up python3-prompt-toolkit (3.0.46-1) ...
88s Setting up libdebuginfod1t64:amd64 (0.191-1) ...
88s Setting up python3-tinycss2 (1.3.0-1) ...
88s Setting up libzmq5:amd64 (4.3.5-1build2) ...
88s Setting up python3-jedi (0.19.1+ds1-1) ...
89s Setting up python3-pytest (7.4.4-1) ...
89s Setting up libjs-bootstrap-tour (0.12.0+dfsg-5) ...
89s Setting up libjs-backbone (1.4.1~dfsg+~1.4.15-3) ...
89s Setting up libsource-highlight4t64:amd64 (3.1.9-4.3build1) ...
89s Setting up python3-nbformat (5.9.1-1) ...
89s Setting up python3-bs4 (4.12.3-1) ...
89s Setting up python3-bleach (6.1.0-2) ...
89s Setting up python3-matplotlib-inline (0.1.6-2) ...
89s Setting up python3-comm (0.2.1-1) ...
90s Setting up python3-prometheus-client (0.19.0+ds1-1) ...
90s Setting up gdb (15.0.50.20240403-0ubuntu1) ...
90s Setting up libjs-jquery-ui (1.13.2+dfsg-1) ...
90s Setting up python3-pexpect (4.9-2) ...
90s Setting up python3-zmq (24.0.1-5build1) ...
90s Setting up libjs-sphinxdoc (7.2.6-8) ...
90s Setting up python3-terminado (0.18.1-1) ...
90s Setting up python3-jupyter-client (7.4.9-2ubuntu1) ...
90s Setting up jupyter-core (5.3.2-2) ...
90s Setting up python3-pydevd (2.10.0+ds-10ubuntu1) ...
91s Setting up python3-debugpy (1.8.0+ds-4ubuntu4) ...
91s Setting up python-notebook-doc (6.4.12-2.2ubuntu1) ...
91s Setting up python3-nbclient (0.8.0-1) ...
91s Setting up python3-ipython (8.20.0-1ubuntu1) ...
92s Setting up python3-ipykernel (6.29.3-1ubuntu1) ...
92s Setting up python3-nbconvert (7.16.4-1) ...
92s Setting up python3-notebook (6.4.12-2.2ubuntu1) ...
92s Setting up jupyter-notebook (6.4.12-2.2ubuntu1) ...
92s Setting up autopkgtest-satdep (0) ...
92s Processing triggers for man-db (2.12.1-2) ...
93s Processing triggers for libc-bin (2.39-0ubuntu9) ...
96s (Reading database ... 91052 files and directories currently installed.)
96s Removing autopkgtest-satdep (0) ...
96s autopkgtest [11:53:04]: test pytest: [-----------------------
97s ============================= test session starts ==============================
97s platform linux -- Python 3.12.4, pytest-7.4.4, pluggy-1.5.0
97s rootdir: /tmp/autopkgtest.iPdHX2/build.Zbq/src
97s collected 330 items / 5 deselected / 325 selected
97s
98s notebook/auth/tests/test_login.py EE [ 0%]
98s notebook/auth/tests/test_security.py .... [ 1%]
99s notebook/bundler/tests/test_bundler_api.py EEEEE [ 3%]
99s notebook/bundler/tests/test_bundler_tools.py ............. [ 7%]
99s notebook/bundler/tests/test_bundlerextension.py ... [ 8%]
99s notebook/nbconvert/tests/test_nbconvert_handlers.py ssssss [ 10%]
100s notebook/services/api/tests/test_api.py EEE [ 11%]
100s notebook/services/config/tests/test_config_api.py EEE [ 12%]
101s notebook/services/contents/tests/test_contents_api.py EsEEEEEEEEEEssEEsE [ 17%]
109s EEEEEEEEEEEEEEEEEEEEEEEEEsEEEEEEEEEEEssEEsEEEEEEEEEEEEEEEEEEEEEEEEE [ 38%]
109s notebook/services/contents/tests/test_fileio.py ... [ 39%]
109s notebook/services/contents/tests/test_largefilemanager.py . [ 39%]
109s notebook/services/contents/tests/test_manager.py .....s........ss....... [ 46%]
109s ...ss........ [ 50%]
111s notebook/services/kernels/tests/test_kernels_api.py EEEEEEEEEEEE [ 54%]
111s notebook/services/kernelspecs/tests/test_kernelspecs_api.py EEEEEEE [ 56%]
112s notebook/services/nbconvert/tests/test_nbconvert_api.py E [ 56%]
113s notebook/services/sessions/tests/test_sessionmanager.py FFFFFFFFF [ 59%]
115s notebook/services/sessions/tests/test_sessions_api.py EEEEEEEEEEEEEEEEEE [ 64%]
115s EEEE [ 66%]
116s notebook/terminal/tests/test_terminals_api.py EEEEEEEE [ 68%]
116s notebook/tests/test_config_manager.py . [ 68%]
117s notebook/tests/test_files.py EEEEE [ 70%]
118s notebook/tests/test_gateway.py EEEEEE [ 72%]
118s notebook/tests/test_i18n.py . [ 72%]
118s notebook/tests/test_log.py . [ 72%]
118s notebook/tests/test_nbextensions.py ................................... [ 83%]
121s notebook/tests/test_notebookapp.py FFFFFFFFF........F.EEEEEEE [ 91%]
121s notebook/tests/test_paths.py ..E [ 92%]
121s notebook/tests/test_serialize.py .. [ 93%]
122s notebook/tests/test_serverextensions.py ...FF [ 94%]
122s notebook/tests/test_traittypes.py ........... [ 98%]
123s notebook/tests/test_utils.py F...s [ 99%]
123s notebook/tree/tests/test_tree_handler.py E [100%]
123s
123s ==================================== ERRORS ====================================
123s __________________ ERROR at setup of LoginTest.test_next_bad ___________________
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s > sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s address: tuple[str, int],
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s source_address: tuple[str, int] | None = None,
123s socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s """Connect to *address* and return the socket object.
123s
123s Convenience function. Connect to *address* (a 2-tuple ``(host,
123s port)``) and return the socket object. Passing the optional
123s *timeout* parameter will set the timeout on the socket instance
123s before attempting to connect. If no *timeout* is supplied, the
123s global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s is used. If *source_address* is set it must be a tuple of (host, port)
123s for the socket to bind as a source address before making the connection.
123s An host of '' or port 0 tells the OS to use the default.
123s """
123s
123s host, port = address
123s if host.startswith("["):
123s host = host.strip("[]")
123s err = None
123s
123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s # The original create_connection function always returns all records.
123s family = allowed_gai_family()
123s
123s try:
123s host.encode("idna")
123s except UnicodeError:
123s raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s af, socktype, proto, canonname, sa = res
123s sock = None
123s try:
123s sock = socket.socket(af, socktype, proto)
123s
123s # If provided, set socket level options before connecting.
123s _set_socket_options(sock, socket_options)
123s
123s if timeout is not _DEFAULT_TIMEOUT:
123s sock.settimeout(timeout)
123s if source_address:
123s sock.bind(source_address)
123s > sock.connect(sa)
123s E ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s def urlopen( # type: ignore[override]
123s self,
123s method: str,
123s url: str,
123s body: _TYPE_BODY | None = None,
123s headers: typing.Mapping[str, str] | None = None,
123s retries: Retry | bool | int | None = None,
123s redirect: bool = True,
123s assert_same_host: bool = True,
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s pool_timeout: int | None = None,
123s release_conn: bool | None = None,
123s chunked: bool = False,
123s body_pos: _TYPE_BODY_POSITION | None = None,
123s preload_content: bool = True,
123s decode_content: bool = True,
123s **response_kw: typing.Any,
123s ) -> BaseHTTPResponse:
123s """
123s Get a connection from the pool and perform an HTTP request. This is the
123s lowest level call for making a request, so you'll need to specify all
123s the raw details.
123s
123s .. note::
123s
123s More commonly, it's appropriate to use a convenience method
123s such as :meth:`request`.
123s
123s .. note::
123s
123s `release_conn` will only behave as expected if
123s `preload_content=False` because we want to make
123s `preload_content=False` the default behaviour someday soon without
123s breaking backwards compatibility.
123s
123s :param method:
123s HTTP request method (such as GET, POST, PUT, etc.)
123s
123s :param url:
123s The URL to perform the request on.
123s
123s :param body:
123s Data to send in the request body, either :class:`str`, :class:`bytes`,
123s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s :param headers:
123s Dictionary of custom headers to send, such as User-Agent,
123s If-None-Match, etc. If None, pool headers are used. If provided,
123s these headers completely replace any pool-specific headers.
123s
123s :param retries:
123s Configure the number of retries to allow before raising a
123s :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s Pass ``None`` to retry until you receive a response. Pass a
123s :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s over different types of retries.
123s Pass an integer number to retry connection errors that many times,
123s but no other types of errors. Pass zero to never retry.
123s
123s If ``False``, then retries are disabled and any exception is raised
123s immediately. Also, instead of raising a MaxRetryError on redirects,
123s the redirect response will be returned.
123s
123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s :param redirect:
123s If True, automatically handle redirects (status codes 301, 302,
123s 303, 307, 308). Each redirect counts as a retry. Disabling retries
123s will disable redirect, too.
123s
123s :param assert_same_host:
123s If ``True``, will make sure that the host of the pool requests is
123s consistent else will raise HostChangedError. When ``False``, you can
123s use the pool on an HTTP proxy and request foreign hosts.
123s
123s :param timeout:
123s If specified, overrides the default timeout for this one
123s request. It may be a float (in seconds) or an instance of
123s :class:`urllib3.util.Timeout`.
123s
123s :param pool_timeout:
123s If set and the pool is set to block=True, then this method will
123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s connection is available within the time period.
123s
123s :param bool preload_content:
123s If True, the response's body will be preloaded into memory.
123s
123s :param bool decode_content:
123s If True, will attempt to decode the body based on the
123s 'content-encoding' header.
123s
123s :param release_conn:
123s If False, then the urlopen call will not release the connection
123s back into the pool once a response is received (but will release if
123s you read the entire contents of the response such as when
123s `preload_content=True`). This is useful if you're not preloading
123s the response's content immediately. You will need to call
123s ``r.release_conn()`` on the response ``r`` to return the connection
123s back into the pool. If None, it takes the value of ``preload_content``
123s which defaults to ``True``.
123s
123s :param bool chunked:
123s If True, urllib3 will send the body using chunked transfer
123s encoding. Otherwise, urllib3 will send the body using the standard
123s content-length form. Defaults to False.
123s
123s :param int body_pos:
123s Position to seek to in file-like body in the event of a retry or
123s redirect. Typically this won't need to be set because urllib3 will
123s auto-populate the value when needed.
123s """
123s parsed_url = parse_url(url)
123s destination_scheme = parsed_url.scheme
123s
123s if headers is None:
123s headers = self.headers
123s
123s if not isinstance(retries, Retry):
123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s if release_conn is None:
123s release_conn = preload_content
123s
123s # Check host
123s if assert_same_host and not self.is_same_host(url):
123s raise HostChangedError(self, url, retries)
123s
123s # Ensure that the URL we're connecting to is properly encoded
123s if url.startswith("/"):
123s url = to_str(_encode_target(url))
123s else:
123s url = to_str(parsed_url.url)
123s
123s conn = None
123s
123s # Track whether `conn` needs to be released before
123s # returning/raising/recursing. Update this variable if necessary, and
123s # leave `release_conn` constant throughout the function. That way, if
123s # the function recurses, the original value of `release_conn` will be
123s # passed down into the recursive call, and its value will be respected.
123s #
123s # See issue #651 [1] for details.
123s #
123s # [1]
123s release_this_conn = release_conn
123s
123s http_tunnel_required = connection_requires_http_tunnel(
123s self.proxy, self.proxy_config, destination_scheme
123s )
123s
123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s # have to copy the headers dict so we can safely change it without those
123s # changes being reflected in anyone else's copy.
123s if not http_tunnel_required:
123s headers = headers.copy() # type: ignore[attr-defined]
123s headers.update(self.proxy_headers) # type: ignore[union-attr]
123s
123s # Must keep the exception bound to a separate variable or else Python 3
123s # complains about UnboundLocalError.
123s err = None
123s
123s # Keep track of whether we cleanly exited the except block. This
123s # ensures we do proper cleanup in finally.
123s clean_exit = False
123s
123s # Rewind body position, if needed. Record current position
123s # for future rewinds in the event of a redirect/retry.
123s body_pos = set_file_position(body, body_pos)
123s
123s try:
123s # Request a connection from the queue.
123s timeout_obj = self._get_timeout(timeout)
123s conn = self._get_conn(timeout=pool_timeout)
123s
123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s
123s # Is this a closed/new connection that requires CONNECT tunnelling?
123s if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s try:
123s self._prepare_proxy(conn)
123s except (BaseSSLError, OSError, SocketTimeout) as e:
123s self._raise_timeout(
123s err=e, url=self.proxy.url, timeout_value=conn.timeout
123s )
123s raise
123s
123s # If we're going to release the connection in ``finally:``, then
123s # the response doesn't need to know about the connection. Otherwise
123s # it will also try to release it and we'll have a double-release
123s # mess.
123s response_conn = conn if not release_conn else None
123s
123s # Make the request on the HTTPConnection object
123s > response = self._make_request(
123s conn,
123s method,
123s url,
123s timeout=timeout_obj,
123s body=body,
123s headers=headers,
123s chunked=chunked,
123s retries=retries,
123s response_conn=response_conn,
123s preload_content=preload_content,
123s decode_content=decode_content,
123s **response_kw,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s except socket.gaierror as e:
123s raise NameResolutionError(self.host, self, e) from e
123s except SocketTimeout as e:
123s raise ConnectTimeoutError(
123s self,
123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s ) from e
123s
123s except OSError as e:
123s > raise NewConnectionError(
123s self, f"Failed to establish a new connection: {e}"
123s ) from e
123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s             status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s 
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s 
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ___________________ ERROR at setup of LoginTest.test_next_ok ___________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s 
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s 
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s             status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls = 
123s
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls = 
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s __________ ERROR at setup of BundleAPITest.test_bundler_import_error ___________
123s
123s self = 
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object. Passing the optional
123s     *timeout* parameter will set the timeout on the socket instance
123s     before attempting to connect. If no *timeout* is supplied, the
123s     global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s     is used. If *source_address* is set it must be a tuple of (host, port)
123s     for the socket to bind as a source address before making the connection.
123s     An host of '' or port 0 tells the OS to use the default.
123s     """
123s
123s     host, port = address
123s     if host.startswith("["):
123s         host = host.strip("[]")
123s     err = None
123s
123s     # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s     # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s     # The original create_connection function always returns all records.
123s     family = allowed_gai_family()
123s
123s     try:
123s         host.encode("idna")
123s     except UnicodeError:
123s         raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s         af, socktype, proto, canonname, sa = res
123s         sock = None
123s         try:
123s             sock = socket.socket(af, socktype, proto)
123s
123s             # If provided, set socket level options before connecting.
123s             _set_socket_options(sock, socket_options)
123s
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s
123s         .. note::
123s
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s
123s         if headers is None:
123s             headers = self.headers
123s
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s         if release_conn is None:
123s             release_conn = preload_content
123s
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s
123s         conn = None
123s
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = 
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s             status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls = 
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls = 
123s
123s     @classmethod
123s     def setup_class(cls):
123s         """Make a test notebook. Borrowed from nbconvert test. Assumes the class
123s         teardown will clean it up in the end."""
123s >       super().setup_class()
123s
123s notebook/bundler/tests/test_bundler_api.py:27:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:198: in setup_class
123s     cls.wait_until_alive()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls = 
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _____________ ERROR at setup of BundleAPITest.test_bundler_invoke ______________
123s
123s self = 
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = 
True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. 
Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. 
Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 
123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 
123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | 
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s """Make a test notebook. Borrowed from nbconvert test. Assumes the class 123s teardown will clean it up in the end.""" 123s > super().setup_class() 123s 123s notebook/bundler/tests/test_bundler_api.py:27: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ___________ ERROR at setup of BundleAPITest.test_bundler_not_enabled ___________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s     family = allowed_gai_family()
123s 
123s     try:
123s         host.encode("idna")
123s     except UnicodeError:
123s         raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s         af, socktype, proto, canonname, sa = res
123s         sock = None
123s         try:
123s             sock = socket.socket(af, socktype, proto)
123s 
123s             # If provided, set socket level options before connecting.
123s             _set_socket_options(sock, socket_options)
123s 
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1]
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s 
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s 
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s 
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         """Make a test notebook. Borrowed from nbconvert test. Assumes the class
123s         teardown will clean it up in the end."""
123s >       super().setup_class()
123s 
123s notebook/bundler/tests/test_bundler_api.py:27: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:198: in setup_class
123s     cls.wait_until_alive()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ___________ ERROR at setup of BundleAPITest.test_missing_bundler_arg ___________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s 
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object. Passing the optional
123s     *timeout* parameter will set the timeout on the socket instance
123s     before attempting to connect. If no *timeout* is supplied, the
123s     global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s     is used. If *source_address* is set it must be a tuple of (host, port)
123s     for the socket to bind as a source address before making the connection.
123s     An host of '' or port 0 tells the OS to use the default.
123s     """
123s 
123s     host, port = address
123s     if host.startswith("["):
123s         host = host.strip("[]")
123s     err = None
123s 
123s     # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s     # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s     # The original create_connection function always returns all records.
123s     family = allowed_gai_family()
123s 
123s     try:
123s         host.encode("idna")
123s     except UnicodeError:
123s         raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s         af, socktype, proto, canonname, sa = res
123s         sock = None
123s         try:
123s             sock = socket.socket(af, socktype, proto)
123s 
123s             # If provided, set socket level options before connecting.
123s             _set_socket_options(sock, socket_options)
123s 
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1]
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s 
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s 
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str |
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s """Make a test notebook. Borrowed from nbconvert test. Assumes the class 123s teardown will clean it up in the end.""" 123s > super().setup_class() 123s 123s notebook/bundler/tests/test_bundler_api.py:27: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ___________ ERROR at setup of BundleAPITest.test_notebook_not_found ____________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = 
True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. 
Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. 
Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 
123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 
123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | 
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s """Make a test notebook. Borrowed from nbconvert test. Assumes the class 123s teardown will clean it up in the end.""" 123s > super().setup_class() 123s 123s notebook/bundler/tests/test_bundler_api.py:27: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ___________________ ERROR at setup of APITest.test_get_spec ____________________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = 
True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. 
Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. 
Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 
123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 
123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | 
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s __________________ ERROR at setup of APITest.test_get_status ___________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object.
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _______________ ERROR at setup of APITest.test_no_track_activity _______________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object.
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s             _set_socket_options(sock, socket_options)
123s
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s
123s         .. note::
123s
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s
123s         if headers is None:
123s             headers = self.headers
123s
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s         if release_conn is None:
123s             release_conn = preload_content
123s
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s
123s         conn = None
123s
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1]
123s         release_this_conn = release_conn
123s
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ____________ ERROR at setup of APITest.test_create_retrieve_config _____________
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s
123s         .. note::
123s
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s
123s         if headers is None:
123s             headers = self.headers
123s
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s         if release_conn is None:
123s             release_conn = preload_content
123s
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s
123s         conn = None
123s
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1]
123s         release_this_conn = release_conn
123s
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s __________________ ERROR at setup of APITest.test_get_unknown __________________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
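The requests adapter frames quoted repeatedly in this log normalize the `timeout` argument from a bare float or a `(connect, read)` tuple into a single timeout object. A hedged sketch of that branch, using a hypothetical `TimeoutValue` dataclass in place of requests' internal `TimeoutSauce`:

```python
# Hedged sketch of the timeout normalization shown in the quoted
# requests adapter frames. TimeoutValue is an illustrative stand-in
# for requests' internal TimeoutSauce, not the real class.
from dataclasses import dataclass
from typing import Optional, Tuple, Union


@dataclass
class TimeoutValue:
    connect: Optional[float] = None
    read: Optional[float] = None


def normalize_timeout(
    timeout: Union[float, Tuple[float, float], TimeoutValue, None],
) -> TimeoutValue:
    if isinstance(timeout, tuple):
        try:
            connect, read = timeout
        except ValueError:
            # e.g. a 1- or 3-element tuple cannot unpack into (connect, read)
            raise ValueError(
                f"Invalid timeout {timeout}. Pass a (connect, read) timeout "
                f"tuple, or a single float to set both timeouts to the same "
                f"value."
            )
        return TimeoutValue(connect=connect, read=read)
    if isinstance(timeout, TimeoutValue):
        return timeout
    # A bare float (or None, as in this log) applies to both phases.
    return TimeoutValue(connect=timeout, read=timeout)


print(normalize_timeout((3.05, 27)).read)  # 27
print(normalize_timeout(5.0).connect)      # 5.0
```

The log shows `timeout = Timeout(connect=None, read=None, total=None)` because the test helper calls `requests.get(url)` with no timeout at all, so both phases fall through to `None`.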
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ____________________ ERROR at setup of APITest.test_modify _____________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object.
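[editor's note] The `wait_until_alive` loop that raises the `RuntimeError` above is a standard poll-until-deadline pattern. A minimal standalone sketch, with made-up timeout values (not the `MAX_WAITTIME`/`POLL_INTERVAL` constants from `launchnotebook.py`, and without the early bail-out when the server thread dies):

```python
import time

def wait_until_alive(fetch, max_waittime=5.0, poll_interval=0.1):
    """Poll fetch() until it succeeds or the deadline passes.

    Sketch of the pattern in the traceback above; names and defaults
    are illustrative, not copied from the notebook test harness.
    """
    deadline = time.monotonic() + max_waittime
    last_error = None
    while time.monotonic() < deadline:
        try:
            return fetch()
        except Exception as e:  # the poll loop swallows transient errors
            last_error = e
            time.sleep(poll_interval)
    raise RuntimeError("The notebook server failed to start") from last_error
```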
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
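[editor's note] The root `ConnectionRefusedError: [Errno 111]` above is plain socket behaviour: nothing is listening on `localhost:12341` because the notebook server never started. A stdlib-only reproduction (the "certainly closed" port is found by binding to port 0 and immediately closing, which is slightly racy but reliable in practice):

```python
import socket

# Ask the OS for a free port, then close it so nothing is listening there.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
port = probe.getsockname()[1]
probe.close()

try:
    socket.create_connection(("127.0.0.1", port), timeout=1)
    refused = False
except ConnectionRefusedError as e:
    # errno is ECONNREFUSED (111 on Linux, as in the log above)
    refused = True
    errno_seen = e.errno
```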
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
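[editor's note] The `Retry(total=0, ...)` accounting shown in `increment()` above can be exercised directly: with `total=0`, the first error decrements the counter to -1, the new `Retry` object is exhausted, and `MaxRetryError` is raised from the original error. A sketch against the urllib3 API shown in this traceback (`NewConnectionError` is a `ConnectTimeoutError` subclass, so it counts as a connection error):

```python
from urllib3.exceptions import MaxRetryError, NewConnectionError
from urllib3.util.retry import Retry

# Same Retry configuration as in the log above.
retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)
err = NewConnectionError(None, "[Errno 111] Connection refused")

try:
    retry.increment(method="GET", url="/a%40b/api/contents", error=err)
    exhausted = False
except MaxRetryError as e:
    exhausted = True
    reason = e.reason  # the original NewConnectionError
```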
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s __________________ ERROR at setup of APITest.test_checkpoints __________________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls = 
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls = 
123s
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls = 
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ___________ ERROR at setup of APITest.test_checkpoints_separate_root ___________
123s
123s self = 
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s
123s                 # If provided, set socket level options before connecting.
123s             _set_socket_options(sock, socket_options)
123s 
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s 
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s             raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s 
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s 
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _____________________ ERROR at setup of APITest.test_copy ______________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s 
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object. Passing the optional
123s     *timeout* parameter will set the timeout on the socket instance
123s     before attempting to connect. If no *timeout* is supplied, the
123s     global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s     is used. If *source_address* is set it must be a tuple of (host, port)
123s     for the socket to bind as a source address before making the connection.
123s     An host of '' or port 0 tells the OS to use the default.
123s     """
123s 
123s     host, port = address
123s     if host.startswith("["):
123s         host = host.strip("[]")
123s     err = None
123s 
123s     # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s     # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s     # The original create_connection function always returns all records.
123s     family = allowed_gai_family()
123s 
123s     try:
123s         host.encode("idna")
123s     except UnicodeError:
123s         raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s         af, socktype, proto, canonname, sa = res
123s         sock = None
123s         try:
123s             sock = socket.socket(af, socktype, proto)
123s 
123s             # If provided, set socket level options before connecting.
123s             _set_socket_options(sock, socket_options)
123s 
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s 
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s 
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ________________ ERROR at setup of APITest.test_copy_400_hidden ________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object.
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ___________________ ERROR at setup of APITest.test_copy_copy ___________________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _________________ ERROR at setup of APITest.test_copy_dir_400 __________________
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object.
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ___________________ ERROR at setup of APITest.test_copy_path ___________________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _________________ ERROR at setup of APITest.test_copy_put_400 __________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s 
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen( # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy() # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers) # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s 
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s 
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s 
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ______________ ERROR at setup of APITest.test_copy_put_400_hidden ______________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
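The root failure recorded throughout this log is a TCP connect refused at the socket layer ([Errno 111], visible earlier where `sock.connect(sa)` raises inside `create_connection`). That condition is easy to reproduce with a bare stdlib socket; a minimal sketch (port 1 is an assumption — any local port with no listener behaves the same):

```python
import errno
import socket

# Connecting to a local port with no listener fails fast with
# ECONNREFUSED (errno 111 on Linux) -- the same low-level error that
# urllib3 wraps in NewConnectionError in the traceback above.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.settimeout(2)
try:
    sock.connect(("127.0.0.1", 1))  # port 1: assumed to have no listener
    refused = False
except ConnectionRefusedError as exc:
    refused = exc.errno == errno.ECONNREFUSED
finally:
    sock.close()
```

Because the refusal happens during `connect()`, it surfaces before any HTTP bytes are sent, which is why urllib3 reports it as a connection error rather than a read error.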
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
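The counter bookkeeping in `Retry.increment` above boils down to a decrement-and-check pattern: each failure consumes one attempt, `False` means "disabled, re-raise immediately", and a counter going negative means the budget is exhausted. A stdlib-only sketch of that pattern (function name and the RuntimeError stand-in are illustrative, not urllib3's API):

```python
def spend_attempt(total, error=None):
    """Illustrative model of the decrement-and-check logic in
    Retry.increment above; urllib3 raises MaxRetryError instead."""
    if total is False and error is not None:
        raise error  # retries disabled: re-raise the original error
    if total is not None:
        total -= 1
    if total is not None and total < 0:
        # Budget exhausted -- MaxRetryError in urllib3's version.
        raise RuntimeError("max retries exceeded") from error
    return total
```

With `total=0`, as in the `Retry(total=0, ...)` object in this log, the very first connection error exhausts the budget, which is why a single refused connect is enough to produce MaxRetryError.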
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
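The `tmp()` helper in `setup_class` above guards `os.makedirs` against EEXIST by hand; on Python 3 the same idempotent behaviour is available directly via `exist_ok=True`. A minimal self-contained sketch (the directory names are illustrative):

```python
import os
import tempfile

base = tempfile.mkdtemp()

def tmp(*parts):
    # Same effect as the EEXIST-guarded makedirs in setup_class above:
    # create the directory tree, silently succeeding if it exists.
    path = os.path.join(base, *parts)
    os.makedirs(path, exist_ok=True)
    return path

home_dir = tmp("home")
tmp("home")  # second call is a no-op rather than an OSError
```

`exist_ok=True` still raises if a path component exists but is not a directory, so the behaviour matches the original errno check for the common case.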
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ________________ ERROR at setup of APITest.test_create_untitled ________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s 
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object.
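The `wait_until_alive` loop shown above polls the server until a fetch succeeds or the attempt budget (MAX_WAITTIME/POLL_INTERVAL iterations) is spent, chaining the last error into the final RuntimeError. A generic stdlib sketch of that pattern (function and parameter names are illustrative, not the notebook test suite's API):

```python
import time

def wait_until_alive(probe, max_waittime=30.0, poll_interval=0.1):
    """Poll `probe` until it succeeds, mirroring the loop in
    launchnotebook.py above; re-raises the last failure once the
    time budget is exhausted."""
    last_error = None
    for _ in range(int(max_waittime / poll_interval)):
        try:
            return probe()
        except Exception as exc:
            last_error = exc
            time.sleep(poll_interval)
    raise RuntimeError("server failed to start") from last_error
```

Chaining with `from last_error` preserves the underlying ConnectionError in the traceback, which is exactly how this log ends up showing the refused connect as the cause of "The notebook server failed to start".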
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ______________ ERROR at setup of APITest.test_create_untitled_txt ______________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _______________ ERROR at setup of APITest.test_delete_hidden_dir _______________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ______________ ERROR at setup of APITest.test_delete_hidden_file _______________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s 
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object.
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _______________ ERROR at setup of APITest.test_file_checkpoints ________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
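Editor's note: the `create_connection` excerpt above is where urllib3 resolves `localhost` via `getaddrinfo` and tries each returned address in turn; the `[Errno 111]` in this log is raised by the `sock.connect(sa)` call that follows. A standalone sketch of the same resolve-and-connect loop (illustrative, not urllib3's exact code):

```python
import socket

def connect_first(host, port, timeout=None):
    """Try each address from getaddrinfo until one accepts the connection."""
    err = None
    for af, socktype, proto, _canon, sa in socket.getaddrinfo(
        host, port, socket.AF_UNSPEC, socket.SOCK_STREAM
    ):
        sock = socket.socket(af, socktype, proto)
        try:
            if timeout is not None:
                sock.settimeout(timeout)
            sock.connect(sa)
            return sock  # first address that answers wins
        except OSError as e:
            err = e  # remember the failure, try the next address
            sock.close()
    # Nothing connected: re-raise the last error, as urllib3 does
    raise err if err is not None else OSError("getaddrinfo returned no results")
```

When no address connects (as with the refused port 12341 here), the last `OSError` propagates, which is exactly what urllib3 then wraps in `NewConnectionError`.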
123s                 _set_socket_options(sock, socket_options)
123s 
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s 
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
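Editor's note: the `HTTPAdapter.send` excerpt above is where requests translates urllib3's `MaxRetryError` into its own exception types by inspecting `e.reason`. In this log the reason is a `NewConnectionError`, so none of the specific branches fire and the generic `requests.exceptions.ConnectionError` is raised. A condensed sketch of that branch order, with hypothetical stand-in classes (the real ones live in `urllib3.exceptions` and `requests.exceptions`):

```python
# Stand-ins for the real exception classes. In urllib3, NewConnectionError
# actually subclasses ConnectTimeoutError, which is why the adapter needs
# the extra "not isinstance(..., NewConnectionError)" guard.
class ConnectTimeoutError(Exception): pass
class NewConnectionError(ConnectTimeoutError): pass

class MaxRetryError(Exception):
    def __init__(self, reason):
        super().__init__(reason)
        self.reason = reason

class ConnectTimeout(Exception): pass          # requests' connect-timeout error
class RequestsConnectionError(Exception): pass  # requests' ConnectionError

def map_retry_error(e):
    """Condensed mirror of the branch order in HTTPAdapter.send."""
    if isinstance(e.reason, ConnectTimeoutError):
        if not isinstance(e.reason, NewConnectionError):
            return ConnectTimeout(e)
    # ...ResponseError / proxy / SSL branches elided...
    return RequestsConnectionError(e)  # the fallback taken in this log
```

So a refused connection (a `NewConnectionError` reason) falls through to the final `ConnectionError`, exactly as seen in the tracebacks here.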
123s     raise SSLError(e, request=request)
123s 
123s >   raise ConnectionError(e, request=request)
123s E   requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ________________ ERROR at setup of APITest.test_get_404_hidden _________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s 
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _________________ ERROR at setup of APITest.test_get_bad_type __________________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ___________ ERROR at setup of APITest.test_get_binary_file_contents ____________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ___________ ERROR at setup of APITest.test_get_contents_no_such_file ___________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls = 
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls = 
123s
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls = 
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ______________ ERROR at setup of APITest.test_get_dir_no_content _______________
123s
123s self = 
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object. Passing the optional
123s     *timeout* parameter will set the timeout on the socket instance
123s     before attempting to connect. If no *timeout* is supplied, the
123s     global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s     is used. If *source_address* is set it must be a tuple of (host, port)
123s     for the socket to bind as a source address before making the connection.
123s     An host of '' or port 0 tells the OS to use the default.
123s     """
123s
123s     host, port = address
123s     if host.startswith("["):
123s         host = host.strip("[]")
123s     err = None
123s
123s     # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s     # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s     # The original create_connection function always returns all records.
123s     family = allowed_gai_family()
123s
123s     try:
123s         host.encode("idna")
123s     except UnicodeError:
123s         raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s         af, socktype, proto, canonname, sa = res
123s         sock = None
123s         try:
123s             sock = socket.socket(af, socktype, proto)
123s
123s             # If provided, set socket level options before connecting.
123s             _set_socket_options(sock, socket_options)
123s
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen( # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s
123s         .. note::
123s
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s
123s         if headers is None:
123s             headers = self.headers
123s
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s         if release_conn is None:
123s             release_conn = preload_content
123s
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s
123s         conn = None
123s
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1]
123s         release_this_conn = release_conn
123s
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy() # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers) # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s
123s             conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = 
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls = 
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ________________ ERROR at setup of APITest.test_get_nb_contents ________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s 
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s 
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ________________ ERROR at setup of APITest.test_get_nb_invalid _________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s 
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause)
123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s > cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request)
123s
123s > raise ConnectionError(e, request=request)
123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s @classmethod
123s def setup_class(cls):
123s cls.tmp_dir = TemporaryDirectory()
123s def tmp(*parts):
123s path = os.path.join(cls.tmp_dir.name, *parts)
123s try:
123s os.makedirs(path)
123s except OSError as e:
123s if e.errno != errno.EEXIST:
123s raise
123s return path
123s
123s cls.home_dir = tmp('home')
123s data_dir = cls.data_dir = tmp('data')
123s config_dir = cls.config_dir = tmp('config')
123s runtime_dir = cls.runtime_dir = tmp('runtime')
123s cls.notebook_dir = tmp('notebooks')
123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s cls.env_patch.start()
123s # Patch systemwide & user-wide data & config directories, to isolate
123s # the tests from oddities of the local setup. But leave Python env
123s # locations alone, so data files for e.g. nbconvert are accessible.
123s # If this isolation isn't sufficient, you may need to run the tests in
123s # a virtualenv or conda env.
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s cls.fetch_url(url)
123s except ModuleNotFoundError as error:
123s # Errors that should be immediately thrown back to caller
123s raise error
123s except Exception as e:
123s if not cls.notebook_thread.is_alive():
123s > raise RuntimeError("The notebook server failed to start") from e
123s E RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _______________ ERROR at setup of APITest.test_get_nb_no_content _______________
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s > sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s address: tuple[str, int],
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s source_address: tuple[str, int] | None = None,
123s socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s """Connect to *address* and return the socket object.
123s
123s Convenience function. Connect to *address* (a 2-tuple ``(host,
123s port)``) and return the socket object.
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s > resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s def increment(
123s self,
123s method: str | None = None,
123s url: str | None = None,
123s response: BaseHTTPResponse | None = None,
123s error: Exception | None = None,
123s _pool: ConnectionPool | None = None,
123s _stacktrace: TracebackType | None = None,
123s ) -> Retry:
123s """Return a new Retry object with incremented retry counters.
123s
123s :param response: A response object, or None, if the server did not
123s return a response.
123s :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s :param Exception error: An error encountered during the request, or
123s None if the response was received successfully.
123s
123s :return: A new ``Retry`` object.
123s """
123s if self.total is False and error:
123s # Disabled, indicate to re-raise the error.
123s raise reraise(type(error), error, _stacktrace)
123s
123s total = self.total
123s if total is not None:
123s total -= 1
123s
123s connect = self.connect
123s read = self.read
123s redirect = self.redirect
123s status_count = self.status
123s other = self.other
123s cause = "unknown"
123s status = None
123s redirect_location = None
123s
123s if error and self._is_connection_error(error):
123s # Connect retry?
123s if connect is False:
123s raise reraise(type(error), error, _stacktrace)
123s elif connect is not None:
123s connect -= 1
123s
123s elif error and self._is_read_error(error):
123s # Read retry?
123s if read is False or method is None or not self._is_method_retryable(method):
123s raise reraise(type(error), error, _stacktrace)
123s elif read is not None:
123s read -= 1
123s
123s elif error:
123s # Other retry?
123s if other is not None:
123s other -= 1
123s
123s elif response and response.get_redirect_location():
123s # Redirect retry?
123s if redirect is not None:
123s redirect -= 1
123s cause = "too many redirects"
123s response_redirect_location = response.get_redirect_location()
123s if response_redirect_location:
123s redirect_location = response_redirect_location
123s status = response.status
123s
123s else:
123s # Incrementing because of a server error like a 500 in
123s # status_forcelist and the given method is in the allowed_methods
123s cause = ResponseError.GENERIC_ERROR
123s if response and response.status:
123s if status_count is not None:
123s status_count -= 1
123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s status = response.status
123s
123s history = self.history + (
123s RequestHistory(method, url, error, status, redirect_location),
123s )
123s
123s new_retry = self.new(
123s total=total,
123s connect=connect,
123s read=read,
123s redirect=redirect,
123s status=status_count,
123s other=other,
123s history=history,
123s )
123s
123s if new_retry.is_exhausted():
123s reason = error or ResponseError(cause)
123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s > cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
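The `wait_until_alive` helper quoted in the traceback above polls the contents API until the freshly started server answers; here every attempt died with `[Errno 111] Connection refused` because nothing was listening on port 12341. A minimal stdlib-only sketch of the same poll-until-alive pattern (the socket-level probe and the small `MAX_WAITTIME`/`POLL_INTERVAL` values are illustrative assumptions; the real helper in launchnotebook.py calls `requests.get` against the API URL):

```python
import socket
import time

MAX_WAITTIME = 2.0    # seconds to keep polling (the real harness waits longer)
POLL_INTERVAL = 0.1   # seconds between attempts

def wait_until_alive(host: str, port: int) -> bool:
    """Poll a TCP port until something accepts connections, or give up."""
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            # ECONNREFUSED here is the same [Errno 111] seen in the
            # traceback above: nothing is listening on the port yet.
            socket.create_connection((host, port), timeout=POLL_INTERVAL).close()
            return True
        except OSError:
            time.sleep(POLL_INTERVAL)
    return False
```

Against a port with no listener, every attempt raises `ConnectionRefusedError` (a subclass of `OSError`), so the loop exhausts its budget and returns `False`; the test harness converts that persistent failure into `RuntimeError("The notebook server failed to start")`.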
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s except (ProtocolError, OSError) as err:
123s raise ConnectionError(err, request=request)
123s
123s except MaxRetryError as e:
123s if isinstance(e.reason, ConnectTimeoutError):
123s # TODO: Remove this in 3.0.0: see #2811
123s if not isinstance(e.reason, NewConnectionError):
123s raise ConnectTimeout(e, request=request)
123s
123s if isinstance(e.reason, ResponseError):
123s raise RetryError(e, request=request)
123s
123s if isinstance(e.reason, _ProxyError):
123s raise ProxyError(e, request=request)
123s
123s if isinstance(e.reason, _SSLError):
123s # This branch is for urllib3 v1.22 and later.
123s raise SSLError(e, request=request)
123s
123s > raise ConnectionError(e, request=request)
123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s @classmethod
123s def setup_class(cls):
123s cls.tmp_dir = TemporaryDirectory()
123s def tmp(*parts):
123s path = os.path.join(cls.tmp_dir.name, *parts)
123s try:
123s os.makedirs(path)
123s except OSError as e:
123s if e.errno != errno.EEXIST:
123s raise
123s return path
123s
123s cls.home_dir = tmp('home')
123s data_dir = cls.data_dir = tmp('data')
123s config_dir = cls.config_dir = tmp('config')
123s runtime_dir = cls.runtime_dir = tmp('runtime')
123s cls.notebook_dir = tmp('notebooks')
123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s cls.env_patch.start()
123s # Patch systemwide & user-wide data & config directories, to isolate
123s # the tests from oddities of the local setup. But leave Python env
123s # locations alone, so data files for e.g. nbconvert are accessible.
123s # If this isolation isn't sufficient, you may need to run the tests in
123s # a virtualenv or conda env.
123s cls.path_patch = patch.multiple(
123s jupyter_core.paths,
123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s )
123s cls.path_patch.start()
123s
123s config = cls.config or Config()
123s config.NotebookNotary.db_file = ':memory:'
123s
123s cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s started = Event()
123s def start_thread():
123s try:
123s bind_args = cls.get_bind_args()
123s app = cls.notebook = NotebookApp(
123s port_retries=0,
123s open_browser=False,
123s config_dir=cls.config_dir,
123s data_dir=cls.data_dir,
123s runtime_dir=cls.runtime_dir,
123s notebook_dir=cls.notebook_dir,
123s base_url=cls.url_prefix,
123s config=config,
123s allow_root=True,
123s token=cls.token,
123s **bind_args
123s )
123s if "asyncio" in sys.modules:
123s app._init_asyncio_patch()
123s import asyncio
123s
123s asyncio.set_event_loop(asyncio.new_event_loop())
123s # Patch the current loop in order to match production
123s # behavior
123s import nest_asyncio
123s
123s nest_asyncio.apply()
123s # don't register signal handler during tests
123s app.init_signal = lambda : None
123s # clear log handlers and propagate to root for nose to capture it
123s # needs to be redone after initialize, which reconfigures logging
123s app.log.propagate = True
123s app.log.handlers = []
123s app.initialize(argv=cls.get_argv())
123s app.log.propagate = True
123s app.log.handlers = []
123s loop = IOLoop.current()
123s loop.add_callback(started.set)
123s app.start()
123s finally:
123s # set the event, so failure to start doesn't cause a hang
123s started.set()
123s app.session_manager.close()
123s cls.notebook_thread = Thread(target=start_thread)
123s cls.notebook_thread.daemon = True
123s cls.notebook_thread.start()
123s started.wait()
123s > cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s cls.fetch_url(url)
123s except ModuleNotFoundError as error:
123s # Errors that should be immediately thrown back to caller
123s raise error
123s except Exception as e:
123s if not cls.notebook_thread.is_alive():
123s > raise RuntimeError("The notebook server failed to start") from e
123s E RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ____________ ERROR at setup of APITest.test_get_text_file_contents _____________
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s > sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s address: tuple[str, int],
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s source_address: tuple[str, int] | None = None,
123s socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s """Connect to *address* and return the socket object.
123s
123s Convenience function. Connect to *address* (a 2-tuple ``(host,
123s port)``) and return the socket object. Passing the optional
123s *timeout* parameter will set the timeout on the socket instance
123s before attempting to connect. If no *timeout* is supplied, the
123s global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s is used. If *source_address* is set it must be a tuple of (host, port)
123s for the socket to bind as a source address before making the connection.
123s An host of '' or port 0 tells the OS to use the default.
123s """
123s
123s host, port = address
123s if host.startswith("["):
123s host = host.strip("[]")
123s err = None
123s
123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s # The original create_connection function always returns all records.
123s family = allowed_gai_family()
123s
123s try:
123s host.encode("idna")
123s except UnicodeError:
123s raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s af, socktype, proto, canonname, sa = res
123s sock = None
123s try:
123s sock = socket.socket(af, socktype, proto)
123s
123s # If provided, set socket level options before connecting.
123s _set_socket_options(sock, socket_options)
123s
123s if timeout is not _DEFAULT_TIMEOUT:
123s sock.settimeout(timeout)
123s if source_address:
123s sock.bind(source_address)
123s > sock.connect(sa)
123s E ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s def urlopen( # type: ignore[override]
123s self,
123s method: str,
123s url: str,
123s body: _TYPE_BODY | None = None,
123s headers: typing.Mapping[str, str] | None = None,
123s retries: Retry | bool | int | None = None,
123s redirect: bool = True,
123s assert_same_host: bool = True,
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s pool_timeout: int | None = None,
123s release_conn: bool | None = None,
123s chunked: bool = False,
123s body_pos: _TYPE_BODY_POSITION | None = None,
123s preload_content: bool = True,
123s decode_content: bool = True,
123s **response_kw: typing.Any,
123s ) -> BaseHTTPResponse:
123s """
123s Get a connection from the pool and perform an HTTP request. This is the
123s lowest level call for making a request, so you'll need to specify all
123s the raw details.
123s
123s .. note::
123s
123s More commonly, it's appropriate to use a convenience method
123s such as :meth:`request`.
123s
123s .. note::
123s
123s `release_conn` will only behave as expected if
123s `preload_content=False` because we want to make
123s `preload_content=False` the default behaviour someday soon without
123s breaking backwards compatibility.
123s
123s :param method:
123s HTTP request method (such as GET, POST, PUT, etc.)
123s
123s :param url:
123s The URL to perform the request on.
123s
123s :param body:
123s Data to send in the request body, either :class:`str`, :class:`bytes`,
123s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s :param headers:
123s Dictionary of custom headers to send, such as User-Agent,
123s If-None-Match, etc. If None, pool headers are used. If provided,
123s these headers completely replace any pool-specific headers.
123s
123s :param retries:
123s Configure the number of retries to allow before raising a
123s :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s Pass ``None`` to retry until you receive a response. Pass a
123s :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s over different types of retries.
123s Pass an integer number to retry connection errors that many times,
123s but no other types of errors. Pass zero to never retry.
123s
123s If ``False``, then retries are disabled and any exception is raised
123s immediately. Also, instead of raising a MaxRetryError on redirects,
123s the redirect response will be returned.
123s
123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s :param redirect:
123s If True, automatically handle redirects (status codes 301, 302,
123s 303, 307, 308). Each redirect counts as a retry. Disabling retries
123s will disable redirect, too.
123s
123s :param assert_same_host:
123s If ``True``, will make sure that the host of the pool requests is
123s consistent else will raise HostChangedError. When ``False``, you can
123s use the pool on an HTTP proxy and request foreign hosts.
123s
123s :param timeout:
123s If specified, overrides the default timeout for this one
123s request. It may be a float (in seconds) or an instance of
123s :class:`urllib3.util.Timeout`.
123s
123s :param pool_timeout:
123s If set and the pool is set to block=True, then this method will
123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s connection is available within the time period.
123s
123s :param bool preload_content:
123s If True, the response's body will be preloaded into memory.
123s
123s :param bool decode_content:
123s If True, will attempt to decode the body based on the
123s 'content-encoding' header.
123s
123s :param release_conn:
123s If False, then the urlopen call will not release the connection
123s back into the pool once a response is received (but will release if
123s you read the entire contents of the response such as when
123s `preload_content=True`). This is useful if you're not preloading
123s the response's content immediately. You will need to call
123s ``r.release_conn()`` on the response ``r`` to return the connection
123s back into the pool. If None, it takes the value of ``preload_content``
123s which defaults to ``True``.
123s
123s :param bool chunked:
123s If True, urllib3 will send the body using chunked transfer
123s encoding. Otherwise, urllib3 will send the body using the standard
123s content-length form. Defaults to False.
123s
123s :param int body_pos:
123s Position to seek to in file-like body in the event of a retry or
123s redirect. Typically this won't need to be set because urllib3 will
123s auto-populate the value when needed.
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ___________________ ERROR at setup of APITest.test_list_dirs ___________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s 
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in
send
123s self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s 
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _____________ ERROR at setup of APITest.test_list_nonexistant_dir ______________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ________________ ERROR at setup of APITest.test_list_notebooks _________________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
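The retries value in the traceback, Retry(total=0, connect=None, read=False, ...), is what requests passes by default, so the very first refused connection already exhausts the budget and Retry.increment raises MaxRetryError. A rough sketch of just the counter arithmetic (a stand-in class, not the real urllib3 implementation):

```python
class MaxRetryError(Exception):
    """Stand-in for urllib3.exceptions.MaxRetryError."""

def increment(total):
    # Mirrors the shape of Retry.increment shown in the traceback:
    # retries disabled (False) re-raise the original error at once;
    # otherwise decrement, and a counter below zero means the new
    # Retry state is_exhausted() and MaxRetryError is raised.
    if total is False:
        raise MaxRetryError("retries disabled, re-raising original error")
    total -= 1
    if total < 0:
        raise MaxRetryError("Max retries exceeded")
    return total

print(increment(1))  # one retry budgeted -> returns 0, may try again
try:
    increment(0)     # Retry(total=0): first failure already exhausts
except MaxRetryError as e:
    print(e)         # Max retries exceeded
```

This is why the log shows "Max retries exceeded with url: /a%40b/api/contents" after a single connection attempt.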
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _____________________ ERROR at setup of APITest.test_mkdir _____________________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
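wait_until_alive above polls the contents API and converts the connection error into a RuntimeError as soon as the server thread dies, rather than polling out the full MAX_WAITTIME. A self-contained sketch of that bail-out logic (generic fetch/thread_alive callables and illustrative constants, not the test suite's):

```python
import time

MAX_WAITTIME = 1.0    # illustrative values only
POLL_INTERVAL = 0.05

def wait_until_alive(fetch, thread_alive):
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            fetch()
            return True
        except Exception as e:
            # If the server thread is already dead there is no point
            # retrying: fail fast with a chained exception, as in the
            # RuntimeError("The notebook server failed to start") above.
            if not thread_alive():
                raise RuntimeError("The notebook server failed to start") from e
            time.sleep(POLL_INTERVAL)
    raise RuntimeError("The notebook server didn't respond in time")

def dead_fetch():
    raise ConnectionError("[Errno 111] Connection refused")

try:
    wait_until_alive(dead_fetch, thread_alive=lambda: False)
except RuntimeError as e:
    print(e)  # The notebook server failed to start
```

The chained `from e` is what makes pytest print the ConnectionError chain as "The above exception was the direct cause of the following exception" throughout this log.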
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _______________ ERROR at setup of APITest.test_mkdir_hidden_400 ________________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s
123s         if headers is None:
123s             headers = self.headers
123s
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s         if release_conn is None:
123s             release_conn = preload_content
123s
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s
123s         conn = None
123s
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1]
123s         release_this_conn = release_conn
123s
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ________________ ERROR at setup of APITest.test_mkdir_untitled _________________
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request.
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
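The locals above show `Retry(total=0, connect=None, read=False, ...)`: with a total budget of zero, the very first connection error exhausts the retries and `increment()` raises `MaxRetryError` immediately. A toy model of that bookkeeping, for illustration only (`MiniRetry` and `RetriesExhausted` are made-up names, not urllib3 API):

```python
class RetriesExhausted(Exception):
    pass


class MiniRetry:
    """Toy model of urllib3's Retry bookkeeping: retries are immutable, and
    each failure returns a NEW object with `total` decremented; a negative
    total means the budget is spent and the error surfaces."""

    def __init__(self, total):
        self.total = total

    def increment(self, error):
        new = MiniRetry(self.total - 1)
        if new.total < 0:
            raise RetriesExhausted(f"max retries exceeded: {error}") from error
        return new


# With total=0, as in the log above, the first refused connection
# already exhausts the budget and is re-raised to the caller.
r = MiniRetry(total=0)
try:
    r.increment(ConnectionRefusedError(111, "Connection refused"))
except RetriesExhausted as e:
    print("exhausted:", e)
```

This mirrors why the notebook test suite sees the failure on its first probe: requests' default adapter uses a zero-retry policy, so a dead server port is reported without any retrying by urllib3 itself.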
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s 
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s 
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ____________________ ERROR at setup of APITest.test_rename _____________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
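The `wait_until_alive` classmethod above keeps probing `fetch_url` until the server answers or the server thread dies. The same poll-until-ready pattern, reduced to a standalone sketch (`poll_until` and the budget constants are stand-ins, not the notebook test helpers):

```python
import time

MAX_WAITTIME = 2.0   # seconds to keep polling (stand-in value)
POLL_INTERVAL = 0.1  # delay between attempts (stand-in value)


def poll_until(check, max_waittime=MAX_WAITTIME, poll_interval=POLL_INTERVAL):
    """Call `check()` repeatedly until it stops raising or the budget runs out,
    chaining the last failure into the final error, as the test helper does."""
    last_error = None
    for _ in range(int(max_waittime / poll_interval)):
        try:
            return check()
        except Exception as e:
            last_error = e
            time.sleep(poll_interval)
    raise RuntimeError("The server failed to start") from last_error


# A probe that fails twice before succeeding, mimicking a slow server boot.
attempts = {"n": 0}

def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionRefusedError(111, "Connection refused")
    return "alive"

print(poll_until(flaky))  # prints "alive" on the third attempt
```

In this log the probe never succeeds: the server thread has already died, so the helper's `is_alive()` check converts the connection error into `RuntimeError("The notebook server failed to start")` instead of polling until the timeout.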
123s                 _set_socket_options(sock, socket_options)
123s 
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s             More commonly, it's appropriate to use a convenience method
123s             such as :meth:`request`.
123s 
123s         .. note::
123s 
123s             `release_conn` will only behave as expected if
123s             `preload_content=False` because we want to make
123s             `preload_content=False` the default behaviour someday soon without
123s             breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
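The root failure in this log is the `ConnectionRefusedError: [Errno 111]` raised by `sock.connect(sa)` above: the kernel refuses a TCP connect because nothing is listening on the notebook server's port (12341). That failure mode can be reproduced with the standard library alone; `probe` is a hypothetical helper for illustration, and errno 111 (`ECONNREFUSED`) is the Linux value:

```python
import errno
import socket


def probe(host, port):
    """Try a TCP connect; return the OS errno on failure, None on success."""
    try:
        with socket.create_connection((host, port), timeout=5):
            return None
    except OSError as e:
        return e.errno


# Ask the OS for a port that is currently free, then release it, so the
# subsequent connect attempt finds nobody listening there.
with socket.socket() as s:
    s.bind(("127.0.0.1", 0))
    free_port = s.getsockname()[1]

# Connecting to the unbound port yields ECONNREFUSED, the same low-level
# error urllib3 wraps in NewConnectionError in the traceback above.
print(probe("127.0.0.1", free_port) == errno.ECONNREFUSED)
```

There is a small race here (another process could grab the port between release and connect), which is harmless for a demonstration but is exactly why the test suite polls rather than probing once.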
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s 
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
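The `HTTPAdapter.send()` excerpt above translates urllib3's `MaxRetryError` into a requests-level exception by inspecting `e.reason`. The extra `isinstance(e.reason, NewConnectionError)` guard exists because `NewConnectionError` subclasses `ConnectTimeoutError` in urllib3, yet a refused connection should not surface as a timeout (hence the "Remove this in 3.0.0" TODO). A self-contained sketch of that dispatch, using stand-in exception classes that mirror the hierarchy:

```python
# Sketch of the exception mapping in the HTTPAdapter.send() code above.
# Stand-in classes mirror urllib3's hierarchy: NewConnectionError subclasses
# ConnectTimeoutError, which is exactly why the inner isinstance guard is
# needed before raising ConnectTimeout.
class ConnectTimeoutError(Exception): pass
class NewConnectionError(ConnectTimeoutError): pass
class ResponseError(Exception): pass

class MaxRetryError(Exception):
    def __init__(self, reason):
        self.reason = reason

def map_retry_error(e):
    """Return the name of the requests-level exception that would be raised."""
    if isinstance(e.reason, ConnectTimeoutError):
        # A NewConnectionError is *also* a ConnectTimeoutError, but must not
        # be reported as a timeout.
        if not isinstance(e.reason, NewConnectionError):
            return "ConnectTimeout"
    if isinstance(e.reason, ResponseError):
        return "RetryError"
    return "ConnectionError"

# A refused connection (NewConnectionError) falls through to ConnectionError,
# matching the requests.exceptions.ConnectionError seen in this log.
print(map_retry_error(MaxRetryError(NewConnectionError("refused"))))
```

So the `ConnectionRefusedError` from the socket layer arrives at the test as `requests.exceptions.ConnectionError`, with the full chain preserved via `__cause__`.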
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
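The `setup_class()` listing above starts the notebook server in a daemon thread and gates the main thread on a `threading.Event` that is set in a `finally` block, so a startup failure can never leave `started.wait()` hanging. A reduced sketch of just that synchronization pattern (the simulated failure stands in for `app.initialize()`/`app.start()` raising):

```python
# Sketch of the Event-guarded startup pattern from setup_class() above.
# The OSError is a simulated startup failure standing in for
# app.initialize()/app.start() raising in the server thread.
import threading

started = threading.Event()
outcome = {}

def start_thread():
    try:
        raise OSError("simulated startup failure")
    except OSError as e:
        # The real code lets this propagate out of the thread; it is
        # recorded here so the sketch's output stays clean.
        outcome["error"] = e
    finally:
        # set the event, so failure to start doesn't cause a hang
        started.set()

t = threading.Thread(target=start_thread, daemon=True)
t.start()
started.wait(timeout=5)
print("started event set:", started.is_set(), "| startup error:", outcome.get("error"))
```

Note the trade-off visible in this log: because the event is set on failure too, `setup_class` proceeds to `wait_until_alive()` against a server that never came up, which is where the `ConnectionRefusedError` chain above begins.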
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _______________ ERROR at setup of APITest.test_rename_400_hidden _______________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
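The `wait_until_alive()` loop that closes the first error report above polls the contents API until it answers, and converts a dead server thread into the `RuntimeError` seen here, chaining the connection failure as `__cause__`. A self-contained sketch of that loop (the `MAX_WAITTIME`/`POLL_INTERVAL` values are illustrative, not the constants from `launchnotebook.py`):

```python
# Sketch of the wait_until_alive() polling loop above. The timing constants
# are illustrative placeholders, not the values used by launchnotebook.py.
import time

MAX_WAITTIME = 1.0
POLL_INTERVAL = 0.1

def wait_until_alive(fetch, server_alive):
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            fetch()
            return
        except Exception as e:
            if not server_alive():
                # Chaining with "from e" produces the "direct cause of the
                # following exception" linkage visible in this log.
                raise RuntimeError("The notebook server failed to start") from e
            time.sleep(POLL_INTERVAL)
    raise RuntimeError("The notebook server did not respond in time")

def refused():
    # Stand-in for requests.get() failing against a dead server.
    raise ConnectionError("[Errno 111] Connection refused")

try:
    wait_until_alive(refused, server_alive=lambda: False)
except RuntimeError as e:
    print(e, "| cause:", e.__cause__)
```

Because the server thread already died, the loop bails on the first poll instead of burning through the whole wait budget, which is why every test in this class errors out at setup almost immediately.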
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
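The root cause at the bottom of the chain above is `sock.connect(sa)` failing with errno 111. This can be reproduced without any of the HTTP stack by connecting to a TCP port with no listener (assumption: the port obtained by bind-then-close stays unused for the instant between closing and connecting):

```python
# Self-contained demonstration of the root cause above: connecting to a TCP
# port with no listener fails with errno 111 (ECONNREFUSED) on Linux.
import errno
import socket

# Ask the kernel for a free port, then release it so nothing is listening.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
port = probe.getsockname()[1]
probe.close()

try:
    socket.create_connection(("127.0.0.1", port), timeout=1)
    print("unexpectedly connected")
except OSError as e:
    print("refused:", e.errno == errno.ECONNREFUSED)
```

In this log the refused port is 12341, the fixed port the test harness expects the notebook server to be listening on; since the server thread died during startup, nothing ever bound it.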
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ________________ ERROR at setup of APITest.test_rename_existing ________________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _____________________ ERROR at setup of APITest.test_save ______________________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s             _set_socket_options(sock, socket_options)
123s 
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s             More commonly, it's appropriate to use a convenience method
123s             such as :meth:`request`.
123s 
123s         .. note::
123s 
123s             `release_conn` will only behave as expected if
123s             `preload_content=False` because we want to make
123s             `preload_content=False` the default behaviour someday soon without
123s             breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
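The repeated "The above exception was the direct cause of the following exception" lines in this log come from PEP 3134 exception chaining: `_new_conn` re-raises the low-level `OSError` as a `NewConnectionError` via `raise ... from e`. A small sketch with a stand-in exception class (not urllib3's actual one), showing how the original error stays reachable through `__cause__`:

```python
class NewConnectionError(Exception):
    """Stand-in for urllib3.exceptions.NewConnectionError (illustration only)."""

def new_conn():
    try:
        # Simulate the socket layer failing exactly as in the log above.
        raise OSError(111, "Connection refused")
    except OSError as e:
        # `raise ... from e` records e as __cause__; that link is what makes
        # tracebacks print "... was the direct cause of the following exception".
        raise NewConnectionError(f"Failed to establish a new connection: {e}") from e

try:
    new_conn()
except NewConnectionError as err:
    cause = err.__cause__

assert isinstance(cause, OSError) and cause.errno == 111
```

This chaining is why one refused `connect()` shows up three times in the log: once as `ConnectionRefusedError`, once wrapped by urllib3, and once wrapped by requests.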
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
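The test harness requests with `Retry(total=0, ...)`, so a single refused connect already exhausts the budget: `increment()` drops `total` to -1 and `is_exhausted()` triggers `MaxRetryError` immediately, which is exactly what the traceback above shows. A toy model of just that counter logic (hypothetical names and a stand-in exception, not urllib3's API):

```python
class MaxRetryError(Exception):
    """Stand-in for urllib3.exceptions.MaxRetryError (illustration only)."""

def increment(total):
    # Mirrors the shape of Retry.increment: decrement the retry budget,
    # then raise once it goes negative ("exhausted").
    if total is not None:
        total -= 1
    if total is not None and total < 0:
        raise MaxRetryError("Max retries exceeded")
    return total

assert increment(1) == 0      # one failure consumed, budget not yet exhausted

try:
    increment(0)              # total=0: the first failure already exhausts it
    exhausted = False
except MaxRetryError:
    exhausted = True
assert exhausted
```

With `total=0` this is intentional in the harness: the poll loop in `wait_until_alive` supplies the retrying itself, so each individual `requests.get` should fail fast.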
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s 
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s 
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ____________________ ERROR at setup of APITest.test_upload _____________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
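The `host.encode("idna")` pre-flight check visible in the `create_connection` frame above is easy to exercise on its own: Python's built-in idna codec rejects empty labels and labels longer than 63 octets with a `UnicodeError`, which urllib3 then converts into the `LocationParseError` seen in its source:

```python
# A DNS label may be at most 63 octets; 64 copies of "a" is one too many.
too_long = "a" * 64

try:
    too_long.encode("idna")
    rejected = False
except UnicodeError:
    rejected = True

# An ordinary hostname such as the one in this log passes the same check.
ok = "localhost".encode("idna")

assert rejected and ok == b"localhost"
```

In this test run the check passes (the host is `localhost`), so execution reaches the `getaddrinfo` loop and fails later at `connect()`.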
123s             _set_socket_options(sock, socket_options)
123s 
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s             More commonly, it's appropriate to use a convenience method
123s             such as :meth:`request`.
123s 
123s         .. note::
123s 
123s             `release_conn` will only behave as expected if
123s             `preload_content=False` because we want to make
123s             `preload_content=False` the default behaviour someday soon without
123s             breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s __________________ ERROR at setup of APITest.test_upload_b64 ___________________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s A host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = 
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls = 
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls = 
123s
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls = 
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s __________________ ERROR at setup of APITest.test_upload_txt ___________________
123s
123s self = 
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object. Passing the optional
123s     *timeout* parameter will set the timeout on the socket instance
123s     before attempting to connect. If no *timeout* is supplied, the
123s     global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s     is used. If *source_address* is set it must be a tuple of (host, port)
123s     for the socket to bind as a source address before making the connection.
123s     An host of '' or port 0 tells the OS to use the default.
123s     """
123s
123s     host, port = address
123s     if host.startswith("["):
123s         host = host.strip("[]")
123s     err = None
123s
123s     # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s     # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s     # The original create_connection function always returns all records.
123s     family = allowed_gai_family()
123s
123s     try:
123s         host.encode("idna")
123s     except UnicodeError:
123s         raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s         af, socktype, proto, canonname, sa = res
123s         sock = None
123s         try:
123s             sock = socket.socket(af, socktype, proto)
123s
123s             # If provided, set socket level options before connecting.
123s             _set_socket_options(sock, socket_options)
123s
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen( # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s             More commonly, it's appropriate to use a convenience method
123s             such as :meth:`request`.
123s
123s         .. note::
123s
123s             `release_conn` will only behave as expected if
123s             `preload_content=False` because we want to make
123s             `preload_content=False` the default behaviour someday soon without
123s             breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s
123s         if headers is None:
123s             headers = self.headers
123s
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s         if release_conn is None:
123s             release_conn = preload_content
123s
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s
123s         conn = None
123s
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy() # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers) # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s
123s             conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = 
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls = 
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _______________ ERROR at setup of APITest.test_upload_txt_hidden _______________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s             _set_socket_options(sock, socket_options)
123s
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s def urlopen(  # type: ignore[override]
123s     self,
123s     method: str,
123s     url: str,
123s     body: _TYPE_BODY | None = None,
123s     headers: typing.Mapping[str, str] | None = None,
123s     retries: Retry | bool | int | None = None,
123s     redirect: bool = True,
123s     assert_same_host: bool = True,
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     pool_timeout: int | None = None,
123s     release_conn: bool | None = None,
123s     chunked: bool = False,
123s     body_pos: _TYPE_BODY_POSITION | None = None,
123s     preload_content: bool = True,
123s     decode_content: bool = True,
123s     **response_kw: typing.Any,
123s ) -> BaseHTTPResponse:
123s     """
123s     Get a connection from the pool and perform an HTTP request. This is the
123s     lowest level call for making a request, so you'll need to specify all
123s     the raw details.
123s
123s     .. note::
123s
123s        More commonly, it's appropriate to use a convenience method
123s        such as :meth:`request`.
123s
123s     .. note::
123s
123s        `release_conn` will only behave as expected if
123s        `preload_content=False` because we want to make
123s        `preload_content=False` the default behaviour someday soon without
123s        breaking backwards compatibility.
123s
123s     :param method:
123s         HTTP request method (such as GET, POST, PUT, etc.)
123s
123s     :param url:
123s         The URL to perform the request on.
123s
123s     :param body:
123s         Data to send in the request body, either :class:`str`, :class:`bytes`,
123s         an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s     :param headers:
123s         Dictionary of custom headers to send, such as User-Agent,
123s         If-None-Match, etc. If None, pool headers are used. If provided,
123s         these headers completely replace any pool-specific headers.
123s
123s     :param retries:
123s         Configure the number of retries to allow before raising a
123s         :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s         Pass ``None`` to retry until you receive a response. Pass a
123s         :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s         over different types of retries.
123s         Pass an integer number to retry connection errors that many times,
123s         but no other types of errors. Pass zero to never retry.
123s
123s         If ``False``, then retries are disabled and any exception is raised
123s         immediately. Also, instead of raising a MaxRetryError on redirects,
123s         the redirect response will be returned.
123s
123s     :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s     :param redirect:
123s         If True, automatically handle redirects (status codes 301, 302,
123s         303, 307, 308). Each redirect counts as a retry. Disabling retries
123s         will disable redirect, too.
123s
123s     :param assert_same_host:
123s         If ``True``, will make sure that the host of the pool requests is
123s         consistent else will raise HostChangedError. When ``False``, you can
123s         use the pool on an HTTP proxy and request foreign hosts.
123s
123s     :param timeout:
123s         If specified, overrides the default timeout for this one
123s         request. It may be a float (in seconds) or an instance of
123s         :class:`urllib3.util.Timeout`.
123s
123s     :param pool_timeout:
123s         If set and the pool is set to block=True, then this method will
123s         block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s         connection is available within the time period.
123s
123s     :param bool preload_content:
123s         If True, the response's body will be preloaded into memory.
123s
123s     :param bool decode_content:
123s         If True, will attempt to decode the body based on the
123s         'content-encoding' header.
123s
123s     :param release_conn:
123s         If False, then the urlopen call will not release the connection
123s         back into the pool once a response is received (but will release if
123s         you read the entire contents of the response such as when
123s         `preload_content=True`). This is useful if you're not preloading
123s         the response's content immediately. You will need to call
123s         ``r.release_conn()`` on the response ``r`` to return the connection
123s         back into the pool. If None, it takes the value of ``preload_content``
123s         which defaults to ``True``.
123s
123s     :param bool chunked:
123s         If True, urllib3 will send the body using chunked transfer
123s         encoding. Otherwise, urllib3 will send the body using the standard
123s         content-length form. Defaults to False.
123s
123s     :param int body_pos:
123s         Position to seek to in file-like body in the event of a retry or
123s         redirect. Typically this won't need to be set because urllib3 will
123s         auto-populate the value when needed.
123s     """
123s     parsed_url = parse_url(url)
123s     destination_scheme = parsed_url.scheme
123s
123s     if headers is None:
123s         headers = self.headers
123s
123s     if not isinstance(retries, Retry):
123s         retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s     if release_conn is None:
123s         release_conn = preload_content
123s
123s     # Check host
123s     if assert_same_host and not self.is_same_host(url):
123s         raise HostChangedError(self, url, retries)
123s
123s     # Ensure that the URL we're connecting to is properly encoded
123s     if url.startswith("/"):
123s         url = to_str(_encode_target(url))
123s     else:
123s         url = to_str(parsed_url.url)
123s
123s     conn = None
123s
123s     # Track whether `conn` needs to be released before
123s     # returning/raising/recursing. Update this variable if necessary, and
123s     # leave `release_conn` constant throughout the function. That way, if
123s     # the function recurses, the original value of `release_conn` will be
123s     # passed down into the recursive call, and its value will be respected.
123s     #
123s     # See issue #651 [1] for details.
123s     #
123s     # [1]
123s     release_this_conn = release_conn
123s
123s     http_tunnel_required = connection_requires_http_tunnel(
123s         self.proxy, self.proxy_config, destination_scheme
123s     )
123s
123s     # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s     # have to copy the headers dict so we can safely change it without those
123s     # changes being reflected in anyone else's copy.
123s     if not http_tunnel_required:
123s         headers = headers.copy()  # type: ignore[attr-defined]
123s         headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s
123s     # Must keep the exception bound to a separate variable or else Python 3
123s     # complains about UnboundLocalError.
123s     err = None
123s
123s     # Keep track of whether we cleanly exited the except block. This
123s     # ensures we do proper cleanup in finally.
123s     clean_exit = False
123s
123s     # Rewind body position, if needed. Record current position
123s     # for future rewinds in the event of a redirect/retry.
123s     body_pos = set_file_position(body, body_pos)
123s
123s     try:
123s         # Request a connection from the queue.
123s         timeout_obj = self._get_timeout(timeout)
123s         conn = self._get_conn(timeout=pool_timeout)
123s
123s         conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s
123s         # Is this a closed/new connection that requires CONNECT tunnelling?
123s         if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s             try:
123s                 self._prepare_proxy(conn)
123s             except (BaseSSLError, OSError, SocketTimeout) as e:
123s                 self._raise_timeout(
123s                     err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                 )
123s                 raise
123s
123s         # If we're going to release the connection in ``finally:``, then
123s         # the response doesn't need to know about the connection. Otherwise
123s         # it will also try to release it and we'll have a double-release
123s         # mess.
123s         response_conn = conn if not release_conn else None
123s
123s         # Make the request on the HTTPConnection object
123s >       response = self._make_request(
123s             conn,
123s             method,
123s             url,
123s             timeout=timeout_obj,
123s             body=body,
123s             headers=headers,
123s             chunked=chunked,
123s             retries=retries,
123s             response_conn=response_conn,
123s             preload_content=preload_content,
123s             decode_content=decode_content,
123s             **response_kw,
123s         )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s     """Establish a socket connection and set nodelay settings on it.
123s
123s     :return: New socket connection.
123s     """
123s     try:
123s         sock = connection.create_connection(
123s             (self._dns_host, self.port),
123s             self.timeout,
123s             source_address=self.source_address,
123s             socket_options=self.socket_options,
123s         )
123s     except socket.gaierror as e:
123s         raise NameResolutionError(self.host, self, e) from e
123s     except SocketTimeout as e:
123s         raise ConnectTimeoutError(
123s             self,
123s             f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s         ) from e
123s
123s     except OSError as e:
123s >       raise NewConnectionError(
123s             self, f"Failed to establish a new connection: {e}"
123s         ) from e
123s E       urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s     self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s     """Sends PreparedRequest object. Returns Response object.
123s
123s     :param request: The :class:`PreparedRequest ` being sent.
123s     :param stream: (optional) Whether to stream the request content.
123s     :param timeout: (optional) How long to wait for the server to send
123s         data before giving up, as a float, or a :ref:`(connect timeout,
123s         read timeout) ` tuple.
123s     :type timeout: float or tuple or urllib3 Timeout object
123s     :param verify: (optional) Either a boolean, in which case it controls whether
123s         we verify the server's TLS certificate, or a string, in which case it
123s         must be a path to a CA bundle to use
123s     :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s     :param proxies: (optional) The proxies dictionary to apply to the request.
123s     :rtype: requests.Response
123s     """
123s
123s     try:
123s         conn = self.get_connection(request.url, proxies)
123s     except LocationValueError as e:
123s         raise InvalidURL(e, request=request)
123s
123s     self.cert_verify(conn, request.url, verify, cert)
123s     url = self.request_url(request, proxies)
123s     self.add_headers(
123s         request,
123s         stream=stream,
123s         timeout=timeout,
123s         verify=verify,
123s         cert=cert,
123s         proxies=proxies,
123s     )
123s
123s     chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s     if isinstance(timeout, tuple):
123s         try:
123s             connect, read = timeout
123s             timeout = TimeoutSauce(connect=connect, read=read)
123s         except ValueError:
123s             raise ValueError(
123s                 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                 f"or a single float to set both timeouts to the same value."
123s             )
123s     elif isinstance(timeout, TimeoutSauce):
123s         pass
123s     else:
123s         timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s     try:
123s >       resp = conn.urlopen(
123s             method=request.method,
123s             url=url,
123s             body=request.body,
123s             headers=request.headers,
123s             redirect=False,
123s             assert_same_host=False,
123s             preload_content=False,
123s             decode_content=False,
123s             retries=self.max_retries,
123s             timeout=timeout,
123s             chunked=chunked,
123s         )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s def increment(
123s     self,
123s     method: str | None = None,
123s     url: str | None = None,
123s     response: BaseHTTPResponse | None = None,
123s     error: Exception | None = None,
123s     _pool: ConnectionPool | None = None,
123s     _stacktrace: TracebackType | None = None,
123s ) -> Retry:
123s     """Return a new Retry object with incremented retry counters.
123s
123s     :param response: A response object, or None, if the server did not
123s         return a response.
123s     :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s     :param Exception error: An error encountered during the request, or
123s         None if the response was received successfully.
123s
123s     :return: A new ``Retry`` object.
123s     """
123s     if self.total is False and error:
123s         # Disabled, indicate to re-raise the error.
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple(
123s jupyter_core.paths,
123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s )
123s cls.path_patch.start()
123s
123s config = cls.config or Config()
123s config.NotebookNotary.db_file = ':memory:'
123s
123s cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s started = Event()
123s def start_thread():
123s try:
123s bind_args = cls.get_bind_args()
123s app = cls.notebook = NotebookApp(
123s port_retries=0,
123s open_browser=False,
123s config_dir=cls.config_dir,
123s data_dir=cls.data_dir,
123s runtime_dir=cls.runtime_dir,
123s notebook_dir=cls.notebook_dir,
123s base_url=cls.url_prefix,
123s config=config,
123s allow_root=True,
123s token=cls.token,
123s **bind_args
123s )
123s if "asyncio" in sys.modules:
123s app._init_asyncio_patch()
123s import asyncio
123s
123s asyncio.set_event_loop(asyncio.new_event_loop())
123s # Patch the current loop in order to match production
123s # behavior
123s import nest_asyncio
123s
123s nest_asyncio.apply()
123s # don't register signal handler during tests
123s app.init_signal = lambda : None
123s # clear log handlers and propagate to root for nose to capture it
123s # needs to be redone after initialize, which reconfigures logging
123s app.log.propagate = True
123s app.log.handlers = []
123s app.initialize(argv=cls.get_argv())
123s app.log.propagate = True
123s app.log.handlers = []
123s loop = IOLoop.current()
123s loop.add_callback(started.set)
123s app.start()
123s finally:
123s # set the event, so failure to start doesn't cause a hang
123s started.set()
123s app.session_manager.close()
123s cls.notebook_thread = Thread(target=start_thread)
123s cls.notebook_thread.daemon = True
123s cls.notebook_thread.start()
123s started.wait()
123s > cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls = 
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s cls.fetch_url(url)
123s except ModuleNotFoundError as error:
123s # Errors that should be immediately thrown back to caller
123s raise error
123s except Exception as e:
123s if not cls.notebook_thread.is_alive():
123s > raise RuntimeError("The notebook server failed to start") from e
123s E RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ___________________ ERROR at setup of APITest.test_upload_v2 ___________________
123s
123s self = 
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s > sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s address: tuple[str, int],
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s source_address: tuple[str, int] | None = None,
123s socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s """Connect to *address* and return the socket object.
123s
123s Convenience function. Connect to *address* (a 2-tuple ``(host,
123s port)``) and return the socket object. Passing the optional
123s *timeout* parameter will set the timeout on the socket instance
123s before attempting to connect. If no *timeout* is supplied, the
123s global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s is used. If *source_address* is set it must be a tuple of (host, port)
123s for the socket to bind as a source address before making the connection.
123s An host of '' or port 0 tells the OS to use the default.
123s """
123s
123s host, port = address
123s if host.startswith("["):
123s host = host.strip("[]")
123s err = None
123s
123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s # The original create_connection function always returns all records.
123s family = allowed_gai_family()
123s
123s try:
123s host.encode("idna")
123s except UnicodeError:
123s raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s af, socktype, proto, canonname, sa = res
123s sock = None
123s try:
123s sock = socket.socket(af, socktype, proto)
123s
123s # If provided, set socket level options before connecting.
123s _set_socket_options(sock, socket_options)
123s
123s if timeout is not _DEFAULT_TIMEOUT:
123s sock.settimeout(timeout)
123s if source_address:
123s sock.bind(source_address)
123s > sock.connect(sa)
123s E ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s def urlopen( # type: ignore[override]
123s self,
123s method: str,
123s url: str,
123s body: _TYPE_BODY | None = None,
123s headers: typing.Mapping[str, str] | None = None,
123s retries: Retry | bool | int | None = None,
123s redirect: bool = True,
123s assert_same_host: bool = True,
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s pool_timeout: int | None = None,
123s release_conn: bool | None = None,
123s chunked: bool = False,
123s body_pos: _TYPE_BODY_POSITION | None = None,
123s preload_content: bool = True,
123s decode_content: bool = True,
123s **response_kw: typing.Any,
123s ) -> BaseHTTPResponse:
123s """
123s Get a connection from the pool and perform an HTTP request. This is the
123s lowest level call for making a request, so you'll need to specify all
123s the raw details.
123s
123s .. note::
123s
123s More commonly, it's appropriate to use a convenience method
123s such as :meth:`request`.
123s
123s .. note::
123s
123s `release_conn` will only behave as expected if
123s `preload_content=False` because we want to make
123s `preload_content=False` the default behaviour someday soon without
123s breaking backwards compatibility.
123s
123s :param method:
123s HTTP request method (such as GET, POST, PUT, etc.)
123s
123s :param url:
123s The URL to perform the request on.
123s
123s :param body:
123s Data to send in the request body, either :class:`str`, :class:`bytes`,
123s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s :param headers:
123s Dictionary of custom headers to send, such as User-Agent,
123s If-None-Match, etc. If None, pool headers are used. If provided,
123s these headers completely replace any pool-specific headers.
123s
123s :param retries:
123s Configure the number of retries to allow before raising a
123s :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s Pass ``None`` to retry until you receive a response. Pass a
123s :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s over different types of retries.
123s Pass an integer number to retry connection errors that many times,
123s but no other types of errors. Pass zero to never retry.
123s
123s If ``False``, then retries are disabled and any exception is raised
123s immediately. Also, instead of raising a MaxRetryError on redirects,
123s the redirect response will be returned.
123s
123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s :param redirect:
123s If True, automatically handle redirects (status codes 301, 302,
123s 303, 307, 308). Each redirect counts as a retry. Disabling retries
123s will disable redirect, too.
123s
123s :param assert_same_host:
123s If ``True``, will make sure that the host of the pool requests is
123s consistent else will raise HostChangedError. When ``False``, you can
123s use the pool on an HTTP proxy and request foreign hosts.
123s
123s :param timeout:
123s If specified, overrides the default timeout for this one
123s request. It may be a float (in seconds) or an instance of
123s :class:`urllib3.util.Timeout`.
123s
123s :param pool_timeout:
123s If set and the pool is set to block=True, then this method will
123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s connection is available within the time period.
123s
123s :param bool preload_content:
123s If True, the response's body will be preloaded into memory.
123s
123s :param bool decode_content:
123s If True, will attempt to decode the body based on the
123s 'content-encoding' header.
123s
123s :param release_conn:
123s If False, then the urlopen call will not release the connection
123s back into the pool once a response is received (but will release if
123s you read the entire contents of the response such as when
123s `preload_content=True`). This is useful if you're not preloading
123s the response's content immediately. You will need to call
123s ``r.release_conn()`` on the response ``r`` to return the connection
123s back into the pool. If None, it takes the value of ``preload_content``
123s which defaults to ``True``.
123s
123s :param bool chunked:
123s If True, urllib3 will send the body using chunked transfer
123s encoding. Otherwise, urllib3 will send the body using the standard
123s content-length form. Defaults to False.
123s
123s :param int body_pos:
123s Position to seek to in file-like body in the event of a retry or
123s redirect. Typically this won't need to be set because urllib3 will
123s auto-populate the value when needed.
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s > resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s
123s def increment(
123s self,
123s method: str | None = None,
123s url: str | None = None,
123s response: BaseHTTPResponse | None = None,
123s error: Exception | None = None,
123s _pool: ConnectionPool | None = None,
123s _stacktrace: TracebackType | None = None,
123s ) -> Retry:
123s """Return a new Retry object with incremented retry counters.
123s
123s :param response: A response object, or None, if the server did not
123s return a response.
123s :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s :param Exception error: An error encountered during the request, or
123s None if the response was received successfully.
123s
123s :return: A new ``Retry`` object.
123s """
123s if self.total is False and error:
123s # Disabled, indicate to re-raise the error.
123s raise reraise(type(error), error, _stacktrace)
123s
123s total = self.total
123s if total is not None:
123s total -= 1
123s
123s connect = self.connect
123s read = self.read
123s redirect = self.redirect
123s status_count = self.status
123s other = self.other
123s cause = "unknown"
123s status = None
123s redirect_location = None
123s
123s if error and self._is_connection_error(error):
123s # Connect retry?
123s if connect is False:
123s raise reraise(type(error), error, _stacktrace)
123s elif connect is not None:
123s connect -= 1
123s
123s elif error and self._is_read_error(error):
123s # Read retry?
123s if read is False or method is None or not self._is_method_retryable(method):
123s raise reraise(type(error), error, _stacktrace)
123s elif read is not None:
123s read -= 1
123s
123s elif error:
123s # Other retry?
123s if other is not None:
123s other -= 1
123s
123s elif response and response.get_redirect_location():
123s # Redirect retry?
123s if redirect is not None:
123s redirect -= 1
123s cause = "too many redirects"
123s response_redirect_location = response.get_redirect_location()
123s if response_redirect_location:
123s redirect_location = response_redirect_location
123s status = response.status
123s
123s else:
123s # Incrementing because of a server error like a 500 in
123s # status_forcelist and the given method is in the allowed_methods
123s cause = ResponseError.GENERIC_ERROR
123s if response and response.status:
123s if status_count is not None:
123s status_count -= 1
123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s status = response.status
123s
123s history = self.history + (
123s RequestHistory(method, url, error, status, redirect_location),
123s )
123s
123s new_retry = self.new(
123s total=total,
123s connect=connect,
123s read=read,
123s redirect=redirect,
123s status=status_count,
123s other=other,
123s history=history,
123s )
123s
123s if new_retry.is_exhausted():
123s reason = error or ResponseError(cause)
123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls = 
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s > cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s except (ProtocolError, OSError) as err:
123s raise ConnectionError(err, request=request)
123s
123s except MaxRetryError as e:
123s if isinstance(e.reason, ConnectTimeoutError):
123s # TODO: Remove this in 3.0.0: see #2811
123s if not isinstance(e.reason, NewConnectionError):
123s raise ConnectTimeout(e, request=request)
123s
123s if isinstance(e.reason, ResponseError):
123s raise RetryError(e, request=request)
123s
123s if isinstance(e.reason, _ProxyError):
123s raise ProxyError(e, request=request)
123s
123s if isinstance(e.reason, _SSLError):
123s # This branch is for urllib3 v1.22 and later.
123s raise SSLError(e, request=request)
123s
123s > raise ConnectionError(e, request=request)
123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls = 
123s
123s @classmethod
123s def setup_class(cls):
123s cls.tmp_dir = TemporaryDirectory()
123s def tmp(*parts):
123s path = os.path.join(cls.tmp_dir.name, *parts)
123s try:
123s os.makedirs(path)
123s except OSError as e:
123s if e.errno != errno.EEXIST:
123s raise
123s return path
123s
123s cls.home_dir = tmp('home')
123s data_dir = cls.data_dir = tmp('data')
123s config_dir = cls.config_dir = tmp('config')
123s runtime_dir = cls.runtime_dir = tmp('runtime')
123s cls.notebook_dir = tmp('notebooks')
123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s cls.env_patch.start()
123s # Patch systemwide & user-wide data & config directories, to isolate
123s # the tests from oddities of the local setup. But leave Python env
123s # locations alone, so data files for e.g. nbconvert are accessible.
123s # If this isolation isn't sufficient, you may need to run the tests in
123s # a virtualenv or conda env.
123s cls.path_patch = patch.multiple(
123s jupyter_core.paths,
123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s )
123s cls.path_patch.start()
123s
123s config = cls.config or Config()
123s config.NotebookNotary.db_file = ':memory:'
123s
123s cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s started = Event()
123s def start_thread():
123s try:
123s bind_args = cls.get_bind_args()
123s app = cls.notebook = NotebookApp(
123s port_retries=0,
123s open_browser=False,
123s config_dir=cls.config_dir,
123s data_dir=cls.data_dir,
123s runtime_dir=cls.runtime_dir,
123s notebook_dir=cls.notebook_dir,
123s base_url=cls.url_prefix,
123s config=config,
123s allow_root=True,
123s token=cls.token,
123s **bind_args
123s )
123s if "asyncio" in sys.modules:
123s app._init_asyncio_patch()
123s import asyncio
123s
123s asyncio.set_event_loop(asyncio.new_event_loop())
123s # Patch the current loop in order to match production
123s # behavior
123s import nest_asyncio
123s
123s nest_asyncio.apply()
123s # don't register signal handler during tests
123s app.init_signal = lambda : None
123s # clear log handlers and propagate to root for nose to capture it
123s # needs to be redone after initialize, which reconfigures logging
123s app.log.propagate = True
123s app.log.handlers = []
123s app.initialize(argv=cls.get_argv())
123s app.log.propagate = True
123s app.log.handlers = []
123s loop = IOLoop.current()
123s loop.add_callback(started.set)
123s app.start()
123s finally:
123s # set the event, so failure to start doesn't cause a hang
123s started.set()
123s app.session_manager.close()
123s cls.notebook_thread = Thread(target=start_thread)
123s cls.notebook_thread.daemon = True
123s cls.notebook_thread.start()
123s started.wait()
123s > cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls = 
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s cls.fetch_url(url)
123s except ModuleNotFoundError as error:
123s # Errors that should be immediately thrown back to caller
123s raise error
123s except Exception as e:
123s if not cls.notebook_thread.is_alive():
123s > raise RuntimeError("The notebook server failed to start") from e
123s E RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _______ ERROR at setup of GenericFileCheckpointsAPITest.test_checkpoints _______
123s
123s self = 
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s > sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s address: tuple[str, int],
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s source_address: tuple[str, int] | None = None,
123s socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s """Connect to *address* and return the socket object.
123s
123s Convenience function. Connect to *address* (a 2-tuple ``(host,
123s port)``) and return the socket object. Passing the optional
123s *timeout* parameter will set the timeout on the socket instance
123s before attempting to connect. If no *timeout* is supplied, the
123s global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s is used. If *source_address* is set it must be a tuple of (host, port)
123s for the socket to bind as a source address before making the connection.
123s An host of '' or port 0 tells the OS to use the default.
123s """
123s
123s host, port = address
123s if host.startswith("["):
123s host = host.strip("[]")
123s err = None
123s
123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s # The original create_connection function always returns all records.
123s family = allowed_gai_family()
123s
123s try:
123s host.encode("idna")
123s except UnicodeError:
123s raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s af, socktype, proto, canonname, sa = res
123s sock = None
123s try:
123s sock = socket.socket(af, socktype, proto)
123s
123s # If provided, set socket level options before connecting.
123s _set_socket_options(sock, socket_options)
123s
123s if timeout is not _DEFAULT_TIMEOUT:
123s sock.settimeout(timeout)
123s if source_address:
123s sock.bind(source_address)
123s > sock.connect(sa)
123s E ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s def urlopen( # type: ignore[override]
123s self,
123s method: str,
123s url: str,
123s body: _TYPE_BODY | None = None,
123s headers: typing.Mapping[str, str] | None = None,
123s retries: Retry | bool | int | None = None,
123s redirect: bool = True,
123s assert_same_host: bool = True,
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s pool_timeout: int | None = None,
123s release_conn: bool | None = None,
123s chunked: bool = False,
123s body_pos: _TYPE_BODY_POSITION | None = None,
123s preload_content: bool = True,
123s decode_content: bool = True,
123s **response_kw: typing.Any,
123s ) -> BaseHTTPResponse:
123s """
123s Get a connection from the pool and perform an HTTP request. This is the
123s lowest level call for making a request, so you'll need to specify all
123s the raw details.
123s
123s .. note::
123s
123s More commonly, it's appropriate to use a convenience method
123s such as :meth:`request`.
123s
123s .. note::
123s
123s `release_conn` will only behave as expected if
123s `preload_content=False` because we want to make
123s `preload_content=False` the default behaviour someday soon without
123s breaking backwards compatibility.
123s
123s :param method:
123s HTTP request method (such as GET, POST, PUT, etc.)
123s
123s :param url:
123s The URL to perform the request on.
123s
123s :param body:
123s Data to send in the request body, either :class:`str`, :class:`bytes`,
123s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s :param headers:
123s Dictionary of custom headers to send, such as User-Agent,
123s If-None-Match, etc. If None, pool headers are used. If provided,
123s these headers completely replace any pool-specific headers.
123s
123s :param retries:
123s Configure the number of retries to allow before raising a
123s :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s Pass ``None`` to retry until you receive a response. Pass a
123s :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s over different types of retries.
123s Pass an integer number to retry connection errors that many times,
123s but no other types of errors. Pass zero to never retry.
123s
123s If ``False``, then retries are disabled and any exception is raised
123s immediately. Also, instead of raising a MaxRetryError on redirects,
123s the redirect response will be returned.
123s
123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s :param redirect:
123s If True, automatically handle redirects (status codes 301, 302,
123s 303, 307, 308). Each redirect counts as a retry. Disabling retries
123s will disable redirect, too.
123s
123s :param assert_same_host:
123s If ``True``, will make sure that the host of the pool requests is
123s consistent else will raise HostChangedError. When ``False``, you can
123s use the pool on an HTTP proxy and request foreign hosts.
123s
123s :param timeout:
123s If specified, overrides the default timeout for this one
123s request. It may be a float (in seconds) or an instance of
123s :class:`urllib3.util.Timeout`.
123s
123s :param pool_timeout:
123s If set and the pool is set to block=True, then this method will
123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s connection is available within the time period.
123s
123s :param bool preload_content:
123s If True, the response's body will be preloaded into memory.
123s
123s :param bool decode_content:
123s If True, will attempt to decode the body based on the
123s 'content-encoding' header.
123s
123s :param release_conn:
123s If False, then the urlopen call will not release the connection
123s back into the pool once a response is received (but will release if
123s you read the entire contents of the response such as when
123s `preload_content=True`). This is useful if you're not preloading
123s the response's content immediately. You will need to call
123s ``r.release_conn()`` on the response ``r`` to return the connection
123s back into the pool. If None, it takes the value of ``preload_content``
123s which defaults to ``True``.
123s
123s :param bool chunked:
123s If True, urllib3 will send the body using chunked transfer
123s encoding. Otherwise, urllib3 will send the body using the standard
123s content-length form. Defaults to False.
123s
123s :param int body_pos:
123s Position to seek to in file-like body in the event of a retry or
123s redirect. Typically this won't need to be set because urllib3 will
123s auto-populate the value when needed.
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _ ERROR at setup of GenericFileCheckpointsAPITest.test_checkpoints_separate_root _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
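The `Retry.increment()` bookkeeping that produced the `MaxRetryError` above can be reduced to a few lines: each failed attempt decrements a counter, and once the counter is exhausted the last error is wrapped and re-raised. The sketch below is a simplified illustration of that logic, not urllib3's actual classes; only the `total` counter is modeled.

```python
# Simplified sketch of urllib3's retry bookkeeping (illustrative only):
# one counter, decremented per error; exhaustion raises MaxRetryError.

class MaxRetryError(Exception):
    def __init__(self, url, reason):
        self.url = url
        self.reason = reason
        super().__init__(f"Max retries exceeded with url: {url} ({reason!r})")

class Retry:
    def __init__(self, total):
        self.total = total

    def increment(self, url, error):
        total = self.total - 1          # one attempt consumed
        if total < 0:                   # mirrors new_retry.is_exhausted()
            raise MaxRetryError(url, error) from error
        return Retry(total)

# The log shows Retry(total=0, ...): the very first connection error
# exhausts the budget, so the ConnectionRefusedError is wrapped immediately.
retry = Retry(total=0)
try:
    retry.increment("/a%40b/api/contents",
                    ConnectionRefusedError(111, "Connection refused"))
except MaxRetryError as e:
    print(type(e).__name__)  # MaxRetryError
```

This is why the test harness fails on the first poll: with `total=0` there is no second attempt, and requests surfaces the wrapped error as a `ConnectionError`.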
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s __ ERROR at setup of GenericFileCheckpointsAPITest.test_config_did_something ___
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object.
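The poll-until-alive pattern in `wait_until_alive()` above can be sketched generically: retry a probe at a fixed interval, fail fast if the server thread has died (more polling cannot help), and otherwise time out with a clear error. The names `probe`, `thread_alive`, `MAX_WAITTIME`, and `POLL_INTERVAL` below are illustrative; this is not the notebook test harness's API.

```python
# Sketch of a poll-until-alive loop, assuming a callable probe that raises
# on failure and a thread_alive() callback reporting server-thread health.
import time

MAX_WAITTIME = 2.0   # seconds; the real suite uses a larger budget
POLL_INTERVAL = 0.1

def wait_until_alive(probe, thread_alive):
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            probe()
            return True
        except Exception as e:
            if not thread_alive():
                # The server thread exited: polling longer cannot succeed,
                # so surface the last probe error as the cause.
                raise RuntimeError("The notebook server failed to start") from e
        time.sleep(POLL_INTERVAL)
    raise RuntimeError("Server did not respond before the timeout")
```

In this log the harness takes the fail-fast branch: the notebook thread is no longer alive, so the `ConnectionError` from the probe becomes the `__cause__` of the `RuntimeError`.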
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s             status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s __________ ERROR at setup of GenericFileCheckpointsAPITest.test_copy ___________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s 
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object. Passing the optional
123s     *timeout* parameter will set the timeout on the socket instance
123s     before attempting to connect. If no *timeout* is supplied, the
123s     global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s     is used. If *source_address* is set it must be a tuple of (host, port)
123s     for the socket to bind as a source address before making the connection.
123s     An host of '' or port 0 tells the OS to use the default.
123s     """
123s 
123s     host, port = address
123s     if host.startswith("["):
123s         host = host.strip("[]")
123s     err = None
123s 
123s     # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s     # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s     # The original create_connection function always returns all records.
123s     family = allowed_gai_family()
123s 
123s     try:
123s         host.encode("idna")
123s     except UnicodeError:
123s         raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s         af, socktype, proto, canonname, sa = res
123s         sock = None
123s         try:
123s             sock = socket.socket(af, socktype, proto)
123s 
123s             # If provided, set socket level options before connecting.
123s             _set_socket_options(sock, socket_options)
123s 
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen( # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s             status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
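An editorial aside on the `tmp()` helper visible in `setup_class` above: it tolerates pre-existing directories by catching `OSError` and checking for `EEXIST`. On Python >= 3.2 the same behaviour is available directly via `os.makedirs(..., exist_ok=True)`; a minimal sketch (not part of the log output):

```python
import os
import tempfile

# Sketch of the EEXIST-tolerant tmp() helper from setup_class above,
# using exist_ok=True instead of catching OSError (Python >= 3.2).
def tmp(base, *parts):
    path = os.path.join(base, *parts)
    os.makedirs(path, exist_ok=True)  # no-op if the directory already exists
    return path

with tempfile.TemporaryDirectory() as d:
    first = tmp(d, 'home')
    second = tmp(d, 'home')  # calling twice does not raise
    assert first == second and os.path.isdir(first)
```

The harness's explicit `errno.EEXIST` check predates `exist_ok`; both variants still raise for any other `OSError` (e.g. permission denied), which is the behaviour the test setup relies on.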
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _____ ERROR at setup of GenericFileCheckpointsAPITest.test_copy_400_hidden _____ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ________ ERROR at setup of GenericFileCheckpointsAPITest.test_copy_copy ________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ______ ERROR at setup of GenericFileCheckpointsAPITest.test_copy_dir_400 _______ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
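[editor's note] The accepted forms of the ``retries`` argument documented above are normalized by ``Retry.from_int``, which the ``urlopen`` body calls. A quick illustration (assumes urllib3 is importable; values are arbitrary):

```python
from urllib3.util.retry import Retry

# An int retries connection-type errors that many times; False disables
# retries entirely; an existing Retry instance passes through unchanged.
r_int = Retry.from_int(2)
r_off = Retry.from_int(False)
r_in = Retry(total=5, redirect=0)
r_obj = Retry.from_int(r_in)
```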
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
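[editor's note] The ``timeout`` parameter described above accepts either a float or a ``urllib3.util.Timeout``; the object form lets the connect and read phases have separate budgets (numbers here are arbitrary):

```python
from urllib3.util import Timeout

# Separate budgets: up to 2 s to establish the TCP connection, up to
# 7 s waiting for each read; total is left unset (None) here.
t = Timeout(connect=2.0, read=7.0)
```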
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
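[editor's note] The adapter code above normalizes the user-facing ``timeout`` into its internal ``TimeoutSauce``. The same three-way normalization in plain Python (a dict stands in for ``TimeoutSauce``; the function name is mine):

```python
def normalize_timeout(timeout):
    # (connect, read) tuple -> split budgets; wrong-size tuples are
    # rejected just as in the adapter; a single number applies to both.
    if isinstance(timeout, tuple):
        try:
            connect, read = timeout
        except ValueError:
            raise ValueError(
                f"Invalid timeout {timeout}. Pass a (connect, read) tuple "
                "or a single float."
            )
        return {"connect": connect, "read": read}
    if isinstance(timeout, dict):
        # Already normalized: pass through unchanged.
        return timeout
    return {"connect": timeout, "read": timeout}
```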
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
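[editor's note] The counter arithmetic in ``increment`` above can be exercised directly (assumes urllib3 is importable): a generic error falls into the "other" branch and decrements ``total``; once ``total`` drops below zero, ``is_exhausted()`` is true and ``increment`` raises the ``MaxRetryError`` seen in the log.

```python
from urllib3.exceptions import MaxRetryError
from urllib3.util.retry import Retry

# One retry in the budget: the first increment returns a fresh Retry.
retry = Retry(total=1)
retry = retry.increment(method="GET", url="/api/contents", error=OSError("refused"))

# The next increment exhausts the budget and raises MaxRetryError,
# wrapping the original error as .reason.
exhausted = False
original = None
try:
    retry = retry.increment(method="GET", url="/api/contents", error=OSError("refused"))
except MaxRetryError as exc:
    exhausted = True
    original = exc.reason
```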
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
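[editor's note] The ``except MaxRetryError`` fall-through above is what turns urllib3's error into the ``requests.exceptions.ConnectionError`` recorded in the log. It can be reproduced against any closed port; ``127.0.0.1:1`` below is simply an address nothing should be listening on:

```python
import requests

wrapped = None
try:
    requests.get("http://127.0.0.1:1/api/contents", timeout=2)
except requests.exceptions.ConnectionError as exc:
    # requests passes the urllib3 MaxRetryError through as the first
    # positional argument of the ConnectionError it raises.
    wrapped = exc.args[0]
```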
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s cls.fetch_url(url)
123s except ModuleNotFoundError as error:
123s # Errors that should be immediately thrown back to caller
123s raise error
123s except Exception as e:
123s if not cls.notebook_thread.is_alive():
123s > raise RuntimeError("The notebook server failed to start") from e
123s E RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ________ ERROR at setup of GenericFileCheckpointsAPITest.test_copy_path ________
123s 
123s self = 
123s 
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s 
123s :return: New socket connection.
123s """
123s try:
123s > sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s def create_connection(
123s address: tuple[str, int],
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s source_address: tuple[str, int] | None = None,
123s socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s """Connect to *address* and return the socket object.
123s 
123s Convenience function. Connect to *address* (a 2-tuple ``(host,
123s port)``) and return the socket object.
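[editor's note] The ``wait_until_alive`` loop above polls the server URL until a fetch succeeds or the thread dies. The polling pattern, sketched generically (``probe`` stands in for ``cls.fetch_url``; names and the monotonic-deadline detail are mine):

```python
import time

def wait_until_alive(probe, max_wait=30.0, poll_interval=0.1):
    # Call probe() repeatedly until it succeeds or the deadline passes;
    # probe is expected to raise on failure, like cls.fetch_url above.
    deadline = time.monotonic() + max_wait
    last_err = None
    while time.monotonic() < deadline:
        try:
            return probe()
        except Exception as e:
            last_err = e
            time.sleep(poll_interval)
    raise RuntimeError("server did not come up in time") from last_err
```

Chaining the final ``RuntimeError`` from the last probe error preserves the root cause, mirroring the ``raise ... from e`` in the test harness.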
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
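The `_new_conn` frame above maps low-level socket failures onto typed exceptions: DNS failure, connect timeout, and any other `OSError` (such as the `[Errno 111] Connection refused` in this log) each get a distinct class. A stdlib-only sketch of that mapping, with hypothetical stand-ins for urllib3's exception classes:

```python
import socket

# Hypothetical stand-ins for urllib3's exception classes.
class NameResolutionError(Exception): pass
class ConnectTimeoutError(Exception): pass
class NewConnectionError(Exception): pass

def new_conn(host, port, timeout=1.0):
    # Order matters: gaierror and timeout are both OSError subclasses,
    # so they must be caught before the generic OSError fallback.
    try:
        return socket.create_connection((host, port), timeout=timeout)
    except socket.gaierror as e:
        raise NameResolutionError(f"Failed to resolve {host!r}") from e
    except socket.timeout as e:
        raise ConnectTimeoutError(
            f"Connection to {host} timed out. (connect timeout={timeout})"
        ) from e
    except OSError as e:
        raise NewConnectionError(f"Failed to establish a new connection: {e}") from e
```

Connecting to a port nothing listens on, as the test harness does here, takes the final `OSError` branch.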
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
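The adapter's timeout normalization shown above can be condensed into a self-contained sketch. `TimeoutSauce` here is a local stand-in for `urllib3.util.Timeout` (requests aliases it under that name); a `(connect, read)` tuple is split into separate budgets, a bare number sets both, and an existing object passes through unchanged:

```python
class TimeoutSauce:
    """Stand-in for urllib3.util.Timeout as aliased by requests."""
    def __init__(self, connect=None, read=None):
        self.connect = connect
        self.read = read

def normalize_timeout(timeout):
    if isinstance(timeout, tuple):
        try:
            connect, read = timeout
        except ValueError:
            # Wrong-sized tuples are rejected, as in the traceback above.
            raise ValueError(
                f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                f"or a single float to set both timeouts to the same value."
            )
        return TimeoutSauce(connect=connect, read=read)
    if isinstance(timeout, TimeoutSauce):
        return timeout
    # A bare number (or None, as in this log) applies to both phases.
    return TimeoutSauce(connect=timeout, read=timeout)
```

In this log `timeout=None`, which is why the trace shows `Timeout(connect=None, read=None, total=None)`.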
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
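`Retry.increment` above decrements a budget on every failure and, once the new policy is exhausted, wraps the original error in `MaxRetryError`. With `Retry(total=0)`, as in this log, the first refused connection already exhausts the budget. A minimal sketch of that bookkeeping (not urllib3's actual class, which also tracks per-category counters and history):

```python
class MaxRetryError(Exception):
    """Stand-in for urllib3.exceptions.MaxRetryError."""

def increment(total, error):
    # Each failure consumes one unit of the ``total`` budget; a budget
    # that has gone negative means the policy is exhausted, and the
    # original error is wrapped rather than raised bare.
    if total is not None:
        total -= 1
    if total is not None and total < 0:
        raise MaxRetryError(f"Max retries exceeded (caused by {error!r})")
    return total
```

So `increment(0, ...)` raises immediately, which is exactly the `Max retries exceeded with url: /a%40b/api/contents` failure shown here.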
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
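The `except MaxRetryError` branch above inspects `e.reason` and re-raises the most specific requests-level exception, falling back to `ConnectionError`. A condensed sketch with stand-in classes, omitting the `NewConnectionError` carve-out flagged by the TODO and the proxy/SSL branches:

```python
# Hypothetical stand-ins: the first two play urllib3's exceptions, the
# last three play requests' exception hierarchy.
class ConnectTimeoutError(Exception): pass
class ResponseError(Exception): pass

class MaxRetryError(Exception):
    def __init__(self, reason):
        super().__init__(repr(reason))
        self.reason = reason

class ConnectTimeout(Exception): pass
class RetryError(Exception): pass
class ReqConnectionError(Exception): pass  # requests.exceptions.ConnectionError

def translate(exc):
    # Most specific match wins; anything else becomes a ConnectionError,
    # which is what the notebook test harness ends up catching here.
    if isinstance(exc.reason, ConnectTimeoutError):
        return ConnectTimeout(exc)
    if isinstance(exc.reason, ResponseError):
        return RetryError(exc)
    return ReqConnectionError(exc)
```

In this log the reason is a `NewConnectionError`, so the fallback `ConnectionError` path is taken.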
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ______ ERROR at setup of GenericFileCheckpointsAPITest.test_copy_put_400 _______ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
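`wait_until_alive` above polls the contents API until the server answers or the poll budget runs out, and only raises `RuntimeError` once the server thread is confirmed dead. A stripped-down sketch of that polling loop; the function name, parameters, and budget values are illustrative, not the harness's exact API:

```python
import time

def wait_until_alive(probe, max_waittime=2.0, poll_interval=0.1):
    # Poll ``probe`` until it stops raising; each failed attempt sleeps
    # for poll_interval, and the loop gives up after max_waittime.
    for _ in range(int(max_waittime / poll_interval)):
        try:
            return probe()
        except Exception:
            time.sleep(poll_interval)
    raise TimeoutError("The server never came up")
```

In the failing run logged here, every probe raises `ConnectionError` because nothing ever listens on port 12341, so the harness falls through to its "notebook server failed to start" error.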
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s             raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ___ ERROR at setup of GenericFileCheckpointsAPITest.test_copy_put_400_hidden ___
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object.
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _____ ERROR at setup of GenericFileCheckpointsAPITest.test_create_untitled _____ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ___ ERROR at setup of GenericFileCheckpointsAPITest.test_create_untitled_txt ___ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ____ ERROR at setup of GenericFileCheckpointsAPITest.test_delete_hidden_dir ____ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
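The `retries` docstring above distinguishes `None`, an int, `False`, and a full `Retry` object. A hedged sketch of the fine-grained form, assuming urllib3 is importable (the specific limits and status codes below are illustrative, not values from this log):

```python
from urllib3.util.retry import Retry

# Illustrative configuration: at most 3 tries overall, at most 2 of
# which may be connect errors, with exponential backoff between tries
# and retries on a few transient server statuses.
retry = Retry(
    total=3,
    connect=2,
    read=2,
    backoff_factor=0.5,
    status_forcelist=(500, 502, 503),
)

# As the docstring notes, a bare int is also accepted; urllib3
# promotes it to a Retry that only counts total attempts.
promoted = Retry.from_int(2)
```

The failing test above effectively uses `Retry(total=0, ...)`, so the very first `ConnectionRefusedError` exhausts the budget and becomes a `MaxRetryError`.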
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
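The repeated "The above exception was the direct cause of the following exception" separators throughout this log come from Python's `raise ... from ...` chaining. A minimal stdlib sketch of the mechanism (the exception class here is a stand-in, not urllib3's real `NewConnectionError`):

```python
class NewConnectionError(OSError):
    """Stand-in for urllib3's exception of the same name."""

def new_conn():
    try:
        raise ConnectionRefusedError(111, "Connection refused")
    except OSError as e:
        # `from e` sets __cause__ on the new exception; tracebacks
        # render that link as "The above exception was the direct
        # cause of the following exception:".
        raise NewConnectionError(
            f"Failed to establish a new connection: {e}"
        ) from e

try:
    new_conn()
except NewConnectionError as err:
    chained = err
```

That is why each test error above prints three stacked tracebacks: `ConnectionRefusedError` → `NewConnectionError`/`MaxRetryError` → `requests.exceptions.ConnectionError`, each linked via `__cause__`.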
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ___ ERROR at setup of GenericFileCheckpointsAPITest.test_delete_hidden_file ____ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
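`wait_until_alive` above polls the contents API until the server answers, the wait budget runs out, or the server thread dies. A generic stdlib sketch of that poll-or-fail pattern (the `wait_until` name and its parameters are ours, not from launchnotebook.py):

```python
import time

def wait_until(probe, is_dead, max_wait=30.0, interval=0.1):
    """Call `probe` until it succeeds; raise RuntimeError at once if
    `is_dead` reports the awaited process/thread has gone away."""
    last_err = None
    for _ in range(int(max_wait / interval)):
        try:
            return probe()
        except Exception as e:
            last_err = e
            if is_dead():
                # Mirrors the RuntimeError("The notebook server
                # failed to start") raised in the traceback above.
                raise RuntimeError("server failed to start") from e
            time.sleep(interval)
    raise TimeoutError("server did not come up in time") from last_err
```

In the failing tests above, the notebook thread has already died by the time the probe fails, so the dead-check branch fires and the original `ConnectionError` survives as `__cause__` of the `RuntimeError`.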
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ____ ERROR at setup of GenericFileCheckpointsAPITest.test_file_checkpoints _____
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object. Passing the optional
123s     *timeout* parameter will set the timeout on the socket instance
123s     before attempting to connect. If no *timeout* is supplied, the
123s     global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s     is used. If *source_address* is set it must be a tuple of (host, port)
123s     for the socket to bind as a source address before making the connection.
123s     An host of '' or port 0 tells the OS to use the default.
123s     """
123s
123s     host, port = address
123s     if host.startswith("["):
123s         host = host.strip("[]")
123s     err = None
123s
123s     # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s     # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s     # The original create_connection function always returns all records.
123s     family = allowed_gai_family()
123s
123s     try:
123s         host.encode("idna")
123s     except UnicodeError:
123s         raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s         af, socktype, proto, canonname, sa = res
123s         sock = None
123s         try:
123s             sock = socket.socket(af, socktype, proto)
123s
123s             # If provided, set socket level options before connecting.
123s             _set_socket_options(sock, socket_options)
123s
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen( # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s             More commonly, it's appropriate to use a convenience method
123s             such as :meth:`request`.
123s
123s         .. note::
123s
123s             `release_conn` will only behave as expected if
123s             `preload_content=False` because we want to make
123s             `preload_content=False` the default behaviour someday soon without
123s             breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
123s ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _____ ERROR at setup of GenericFileCheckpointsAPITest.test_get_404_hidden ______
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function.  Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object.
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s 
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s 
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s 
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s try:
123s resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s 
123s except (ProtocolError, OSError) as err:
123s raise ConnectionError(err, request=request)
123s 
123s except MaxRetryError as e:
123s if isinstance(e.reason, ConnectTimeoutError):
123s # TODO: Remove this in 3.0.0: see #2811
123s if not isinstance(e.reason, NewConnectionError):
123s raise ConnectTimeout(e, request=request)
123s 
123s if isinstance(e.reason, ResponseError):
123s raise RetryError(e, request=request)
123s 
123s if isinstance(e.reason, _ProxyError):
123s raise ProxyError(e, request=request)
123s 
123s if isinstance(e.reason, _SSLError):
123s # This branch is for urllib3 v1.22 and later.
123s raise SSLError(e, request=request)
123s 
123s > raise ConnectionError(e, request=request)
123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s @classmethod
123s def setup_class(cls):
123s cls.tmp_dir = TemporaryDirectory()
123s def tmp(*parts):
123s path = os.path.join(cls.tmp_dir.name, *parts)
123s try:
123s os.makedirs(path)
123s except OSError as e:
123s if e.errno != errno.EEXIST:
123s raise
123s return path
123s 
123s cls.home_dir = tmp('home')
123s data_dir = cls.data_dir = tmp('data')
123s config_dir = cls.config_dir = tmp('config')
123s runtime_dir = cls.runtime_dir = tmp('runtime')
123s cls.notebook_dir = tmp('notebooks')
123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s cls.env_patch.start()
123s # Patch systemwide & user-wide data & config directories, to isolate
123s # the tests from oddities of the local setup. But leave Python env
123s # locations alone, so data files for e.g. nbconvert are accessible.
123s # If this isolation isn't sufficient, you may need to run the tests in
123s # a virtualenv or conda env.
123s cls.path_patch = patch.multiple(
123s jupyter_core.paths,
123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s )
123s cls.path_patch.start()
123s 
123s config = cls.config or Config()
123s config.NotebookNotary.db_file = ':memory:'
123s 
123s cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s started = Event()
123s def start_thread():
123s try:
123s bind_args = cls.get_bind_args()
123s app = cls.notebook = NotebookApp(
123s port_retries=0,
123s open_browser=False,
123s config_dir=cls.config_dir,
123s data_dir=cls.data_dir,
123s runtime_dir=cls.runtime_dir,
123s notebook_dir=cls.notebook_dir,
123s base_url=cls.url_prefix,
123s config=config,
123s allow_root=True,
123s token=cls.token,
123s **bind_args
123s )
123s if "asyncio" in sys.modules:
123s app._init_asyncio_patch()
123s import asyncio
123s 
123s asyncio.set_event_loop(asyncio.new_event_loop())
123s # Patch the current loop in order to match production
123s # behavior
123s import nest_asyncio
123s 
123s nest_asyncio.apply()
123s # don't register signal handler during tests
123s app.init_signal = lambda : None
123s # clear log handlers and propagate to root for nose to capture it
123s # needs to be redone after initialize, which reconfigures logging
123s app.log.propagate = True
123s app.log.handlers = []
123s app.initialize(argv=cls.get_argv())
123s app.log.propagate = True
123s app.log.handlers = []
123s loop = IOLoop.current()
123s loop.add_callback(started.set)
123s app.start()
123s finally:
123s # set the event, so failure to start doesn't cause a hang
123s started.set()
123s app.session_manager.close()
123s cls.notebook_thread = Thread(target=start_thread)
123s cls.notebook_thread.daemon = True
123s cls.notebook_thread.start()
123s started.wait()
123s > cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s cls.fetch_url(url)
123s except ModuleNotFoundError as error:
123s # Errors that should be immediately thrown back to caller
123s raise error
123s except Exception as e:
123s if not cls.notebook_thread.is_alive():
123s > raise RuntimeError("The notebook server failed to start") from e
123s E RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ______ ERROR at setup of GenericFileCheckpointsAPITest.test_get_bad_type _______
123s 
123s self = 
123s 
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s 
123s :return: New socket connection.
123s """
123s try:
123s > sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s def create_connection(
123s address: tuple[str, int],
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s source_address: tuple[str, int] | None = None,
123s socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s """Connect to *address* and return the socket object.
123s 
123s Convenience function. Connect to *address* (a 2-tuple ``(host,
123s port)``) and return the socket object. Passing the optional
123s *timeout* parameter will set the timeout on the socket instance
123s before attempting to connect. If no *timeout* is supplied, the
123s global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s is used. If *source_address* is set it must be a tuple of (host, port)
123s for the socket to bind as a source address before making the connection.
123s An host of '' or port 0 tells the OS to use the default.
123s """
123s 
123s host, port = address
123s if host.startswith("["):
123s host = host.strip("[]")
123s err = None
123s 
123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s # The original create_connection function always returns all records.
123s family = allowed_gai_family()
123s 
123s try:
123s host.encode("idna")
123s except UnicodeError:
123s raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s af, socktype, proto, canonname, sa = res
123s sock = None
123s try:
123s sock = socket.socket(af, socktype, proto)
123s 
123s # If provided, set socket level options before connecting.
123s _set_socket_options(sock, socket_options)
123s 
123s if timeout is not _DEFAULT_TIMEOUT:
123s sock.settimeout(timeout)
123s if source_address:
123s sock.bind(source_address)
123s > sock.connect(sa)
123s E ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s def urlopen( # type: ignore[override]
123s self,
123s method: str,
123s url: str,
123s body: _TYPE_BODY | None = None,
123s headers: typing.Mapping[str, str] | None = None,
123s retries: Retry | bool | int | None = None,
123s redirect: bool = True,
123s assert_same_host: bool = True,
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s pool_timeout: int | None = None,
123s release_conn: bool | None = None,
123s chunked: bool = False,
123s body_pos: _TYPE_BODY_POSITION | None = None,
123s preload_content: bool = True,
123s decode_content: bool = True,
123s **response_kw: typing.Any,
123s ) -> BaseHTTPResponse:
123s """
123s Get a connection from the pool and perform an HTTP request. This is the
123s lowest level call for making a request, so you'll need to specify all
123s the raw details.
123s 
123s .. note::
123s 
123s More commonly, it's appropriate to use a convenience method
123s such as :meth:`request`.
123s 
123s .. note::
123s 
123s `release_conn` will only behave as expected if
123s `preload_content=False` because we want to make
123s `preload_content=False` the default behaviour someday soon without
123s breaking backwards compatibility.
123s 
123s :param method:
123s HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s :param url:
123s The URL to perform the request on.
123s 
123s :param body:
123s Data to send in the request body, either :class:`str`, :class:`bytes`,
123s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s :param headers:
123s Dictionary of custom headers to send, such as User-Agent,
123s If-None-Match, etc. If None, pool headers are used. If provided,
123s these headers completely replace any pool-specific headers.
123s 
123s :param retries:
123s Configure the number of retries to allow before raising a
123s :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s Pass ``None`` to retry until you receive a response. Pass a
123s :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s over different types of retries.
123s Pass an integer number to retry connection errors that many times,
123s but no other types of errors. Pass zero to never retry.
123s 
123s If ``False``, then retries are disabled and any exception is raised
123s immediately. Also, instead of raising a MaxRetryError on redirects,
123s the redirect response will be returned.
123s 
123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s :param redirect:
123s If True, automatically handle redirects (status codes 301, 302,
123s 303, 307, 308). Each redirect counts as a retry. Disabling retries
123s will disable redirect, too.
123s 
123s :param assert_same_host:
123s If ``True``, will make sure that the host of the pool requests is
123s consistent else will raise HostChangedError. When ``False``, you can
123s use the pool on an HTTP proxy and request foreign hosts.
123s 
123s :param timeout:
123s If specified, overrides the default timeout for this one
123s request. It may be a float (in seconds) or an instance of
123s :class:`urllib3.util.Timeout`.
123s 
123s :param pool_timeout:
123s If set and the pool is set to block=True, then this method will
123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s connection is available within the time period.
123s 
123s :param bool preload_content:
123s If True, the response's body will be preloaded into memory.
123s 
123s :param bool decode_content:
123s If True, will attempt to decode the body based on the
123s 'content-encoding' header.
123s 
123s :param release_conn:
123s If False, then the urlopen call will not release the connection
123s back into the pool once a response is received (but will release if
123s you read the entire contents of the response such as when
123s `preload_content=True`). This is useful if you're not preloading
123s the response's content immediately. You will need to call
123s ``r.release_conn()`` on the response ``r`` to return the connection
123s back into the pool. If None, it takes the value of ``preload_content``
123s which defaults to ``True``.
123s 
123s :param bool chunked:
123s If True, urllib3 will send the body using chunked transfer
123s encoding. Otherwise, urllib3 will send the body using the standard
123s content-length form. Defaults to False.
123s 
123s :param int body_pos:
123s Position to seek to in file-like body in the event of a retry or
123s redirect. Typically this won't need to be set because urllib3 will
123s auto-populate the value when needed.
123s """
123s parsed_url = parse_url(url)
123s destination_scheme = parsed_url.scheme
123s 
123s if headers is None:
123s headers = self.headers
123s 
123s if not isinstance(retries, Retry):
123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s if release_conn is None:
123s release_conn = preload_content
123s 
123s # Check host
123s if assert_same_host and not self.is_same_host(url):
123s raise HostChangedError(self, url, retries)
123s 
123s # Ensure that the URL we're connecting to is properly encoded
123s if url.startswith("/"):
123s url = to_str(_encode_target(url))
123s else:
123s url = to_str(parsed_url.url)
123s 
123s conn = None
123s 
123s # Track whether `conn` needs to be released before
123s # returning/raising/recursing. Update this variable if necessary, and
123s # leave `release_conn` constant throughout the function. That way, if
123s # the function recurses, the original value of `release_conn` will be
123s # passed down into the recursive call, and its value will be respected.
123s #
123s # See issue #651 [1] for details.
123s #
123s # [1] 
123s release_this_conn = release_conn
123s 
123s http_tunnel_required = connection_requires_http_tunnel(
123s self.proxy, self.proxy_config, destination_scheme
123s )
123s 
123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s # have to copy the headers dict so we can safely change it without those
123s # changes being reflected in anyone else's copy.
123s if not http_tunnel_required:
123s headers = headers.copy() # type: ignore[attr-defined]
123s headers.update(self.proxy_headers) # type: ignore[union-attr]
123s 
123s # Must keep the exception bound to a separate variable or else Python 3
123s # complains about UnboundLocalError.
123s err = None
123s 
123s # Keep track of whether we cleanly exited the except block. This
123s # ensures we do proper cleanup in finally.
123s clean_exit = False
123s 
123s # Rewind body position, if needed. Record current position
123s # for future rewinds in the event of a redirect/retry.
123s body_pos = set_file_position(body, body_pos)
123s 
123s try:
123s # Request a connection from the queue.
123s timeout_obj = self._get_timeout(timeout)
123s conn = self._get_conn(timeout=pool_timeout)
123s 
123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s 
123s # Is this a closed/new connection that requires CONNECT tunnelling?
123s if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s try:
123s self._prepare_proxy(conn)
123s except (BaseSSLError, OSError, SocketTimeout) as e:
123s self._raise_timeout(
123s err=e, url=self.proxy.url, timeout_value=conn.timeout
123s )
123s raise
123s 
123s # If we're going to release the connection in ``finally:``, then
123s # the response doesn't need to know about the connection. Otherwise
123s # it will also try to release it and we'll have a double-release
123s # mess.
123s response_conn = conn if not release_conn else None
123s 
123s # Make the request on the HTTPConnection object
123s > response = self._make_request(
123s conn,
123s method,
123s url,
123s timeout=timeout_obj,
123s body=body,
123s headers=headers,
123s chunked=chunked,
123s retries=retries,
123s response_conn=response_conn,
123s preload_content=preload_content,
123s decode_content=decode_content,
123s **response_kw,
123s )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s 
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s 
123s :return: New socket connection.
123s """
123s try:
123s sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s except socket.gaierror as e:
123s raise NameResolutionError(self.host, self, e) from e
123s except SocketTimeout as e:
123s raise ConnectTimeoutError(
123s self,
123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s ) from e
123s 
123s except OSError as e:
123s > raise NewConnectionError(
123s self, f"Failed to establish a new connection: {e}"
123s ) from e
123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s 
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s 
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s 
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s 
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s try:
123s > resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s def increment(
123s self,
123s method: str | None = None,
123s url: str | None = None,
123s response: BaseHTTPResponse | None = None,
123s error: Exception | None = None,
123s _pool: ConnectionPool | None = None,
123s _stacktrace: TracebackType | None = None,
123s ) -> Retry:
123s """Return a new Retry object with incremented retry counters.
123s 
123s :param response: A response object, or None, if the server did not
123s return a response.
123s :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s :param Exception error: An error encountered during the request, or
123s None if the response was received successfully.
123s 
123s :return: A new ``Retry`` object.
123s """
123s if self.total is False and error:
123s # Disabled, indicate to re-raise the error.
123s raise reraise(type(error), error, _stacktrace)
123s 
123s total = self.total
123s if total is not None:
123s total -= 1
123s 
123s connect = self.connect
123s read = self.read
123s redirect = self.redirect
123s status_count = self.status
123s other = self.other
123s cause = "unknown"
123s status = None
123s redirect_location = None
123s 
123s if error and self._is_connection_error(error):
123s # Connect retry?
123s if connect is False:
123s raise reraise(type(error), error, _stacktrace)
123s elif connect is not None:
123s connect -= 1
123s 
123s elif error and self._is_read_error(error):
123s # Read retry?
123s if read is False or method is None or not self._is_method_retryable(method):
123s raise reraise(type(error), error, _stacktrace)
123s elif read is not None:
123s read -= 1
123s 
123s elif error:
123s # Other retry?
123s if other is not None:
123s other -= 1
123s 
123s elif response and response.get_redirect_location():
123s # Redirect retry?
123s if redirect is not None:
123s redirect -= 1
123s cause = "too many redirects"
123s response_redirect_location = response.get_redirect_location()
123s if response_redirect_location:
123s redirect_location = response_redirect_location
123s status = response.status
123s 
123s else:
123s # Incrementing because of a server error like a 500 in
123s # status_forcelist and the given method is in the allowed_methods
123s cause = ResponseError.GENERIC_ERROR
123s if response and response.status:
123s if status_count is not None:
123s status_count -= 1
123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s status = response.status
123s 
123s history = self.history + (
123s RequestHistory(method, url, error, status, redirect_location),
123s )
123s 
123s new_retry = self.new(
123s total=total,
123s connect=connect,
123s read=read,
123s redirect=redirect,
123s status=status_count,
123s other=other,
123s history=history,
123s )
123s 
123s if new_retry.is_exhausted():
123s reason = error or ResponseError(cause)
123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s > cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s 
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s 
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s 
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s 
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s try:
123s resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s 
123s except (ProtocolError, OSError) as err:
123s raise ConnectionError(err, request=request)
123s 
123s except MaxRetryError as e:
123s if isinstance(e.reason, ConnectTimeoutError):
123s # TODO: Remove this in 3.0.0: see #2811
123s if not isinstance(e.reason, NewConnectionError):
123s raise ConnectTimeout(e, request=request)
123s 
123s if isinstance(e.reason, ResponseError):
123s raise RetryError(e, request=request)
123s 
123s if isinstance(e.reason, _ProxyError):
123s raise ProxyError(e, request=request)
123s 
123s if isinstance(e.reason, _SSLError):
123s # This branch is for urllib3 v1.22 and later.
123s raise SSLError(e, request=request)
123s 
123s > raise ConnectionError(e, request=request)
123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s @classmethod
123s def setup_class(cls):
123s cls.tmp_dir = TemporaryDirectory()
123s def tmp(*parts):
123s path = os.path.join(cls.tmp_dir.name, *parts)
123s try:
123s os.makedirs(path)
123s except OSError as e:
123s if e.errno != errno.EEXIST:
123s raise
123s return path
123s 
123s cls.home_dir = tmp('home')
123s data_dir = cls.data_dir = tmp('data')
123s config_dir = cls.config_dir = tmp('config')
123s runtime_dir = cls.runtime_dir = tmp('runtime')
123s cls.notebook_dir = tmp('notebooks')
123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s cls.env_patch.start()
123s # Patch systemwide & user-wide data & config directories, to isolate
123s # the tests from oddities of the local setup. But leave Python env
123s # locations alone, so data files for e.g. nbconvert are accessible.
123s # If this isolation isn't sufficient, you may need to run the tests in
123s # a virtualenv or conda env.
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _ ERROR at setup of GenericFileCheckpointsAPITest.test_get_binary_file_contents _
123s
123s self = 
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s
123s         Convenience function.  Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object.  Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect.  If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used.  If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request.  This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s
123s         .. note::
123s
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
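The locals shown above confirm the adapter's pool was configured with `Retry(total=0, connect=None, read=False)`, so a single connection-level failure exhausts the retry budget. A small sketch of that exhaustion against urllib3's public `Retry` API (assuming urllib3 2.x is importable; the `outcome` variable is ours):

```python
from urllib3.exceptions import MaxRetryError
from urllib3.util.retry import Retry

# Same configuration as in the log: no retries at all.
retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)
try:
    # increment() returns a new Retry with decremented counters, or raises
    # MaxRetryError once the budget is exhausted; with total=0 the very
    # first failure exhausts it.
    retry.increment(method="GET", url="/a%40b/api/contents",
                    error=OSError(111, "Connection refused"))
    outcome = "retried"
except MaxRetryError:
    outcome = "exhausted"
```

This is why the log shows exactly one connection attempt per request before `MaxRetryError` surfaces.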
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
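Errno 111 in the `NewConnectionError` above is the kernel refusing the TCP handshake because nothing is listening on `localhost:12341`. The refusal can be reproduced with the stdlib alone; the `probe` helper and the bind-then-close trick for finding a listener-free port are illustrative, not from the test suite:

```python
import errno
import socket

def probe(host: str, port: int) -> int:
    """Attempt a TCP connect; return 0 on success or the failing errno."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(2)
        return sock.connect_ex((host, port))

# Pick a loopback port with no listener: bind to port 0 so the OS assigns a
# free port, record it, then close the socket so nothing is accepting on it.
with socket.socket() as tmp:
    tmp.bind(("127.0.0.1", 0))
    dead_port = tmp.getsockname()[1]

refused = probe("127.0.0.1", dead_port)
```

On Linux `refused` comes back as `errno.ECONNREFUSED` (111), the same errno the traceback reports.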
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls = 
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
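The `send` method above is where urllib3's `MaxRetryError` is translated into `requests.exceptions.ConnectionError`, the exception that ultimately reaches `wait_until_alive`. The same end-to-end failure can be sketched against a deliberately dead loopback port (assuming `requests` is importable; the port-finding trick is ours, not the harness's):

```python
import socket
import requests

# Find a loopback port with no listener: bind to 0, record the assigned
# port, then close the socket so the connect below is refused.
with socket.socket() as tmp:
    tmp.bind(("127.0.0.1", 0))
    dead_port = tmp.getsockname()[1]

try:
    requests.get(f"http://127.0.0.1:{dead_port}/a%40b/api/contents", timeout=2)
    outcome = "response"
except requests.exceptions.ConnectionError:
    # urllib3's MaxRetryError surfaces here as requests' ConnectionError,
    # matching the final "E requests.exceptions.ConnectionError" in this log.
    outcome = "connection-error"
```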
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls = 
123s
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls = 
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _ ERROR at setup of GenericFileCheckpointsAPITest.test_get_contents_no_such_file _
123s
123s self = 
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s
123s         Convenience function.  Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object.
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options)
123s
123s if timeout is not _DEFAULT_TIMEOUT:
123s sock.settimeout(timeout)
123s if source_address:
123s sock.bind(source_address)
123s > sock.connect(sa)
123s E ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s def urlopen( # type: ignore[override]
123s self,
123s method: str,
123s url: str,
123s body: _TYPE_BODY | None = None,
123s headers: typing.Mapping[str, str] | None = None,
123s retries: Retry | bool | int | None = None,
123s redirect: bool = True,
123s assert_same_host: bool = True,
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s pool_timeout: int | None = None,
123s release_conn: bool | None = None,
123s chunked: bool = False,
123s body_pos: _TYPE_BODY_POSITION | None = None,
123s preload_content: bool = True,
123s decode_content: bool = True,
123s **response_kw: typing.Any,
123s ) -> BaseHTTPResponse:
123s """
123s Get a connection from the pool and perform an HTTP request. This is the
123s lowest level call for making a request, so you'll need to specify all
123s the raw details.
123s
123s .. note::
123s
123s More commonly, it's appropriate to use a convenience method
123s such as :meth:`request`.
123s
123s .. note::
123s
123s `release_conn` will only behave as expected if
123s `preload_content=False` because we want to make
123s `preload_content=False` the default behaviour someday soon without
123s breaking backwards compatibility.
123s
123s :param method:
123s HTTP request method (such as GET, POST, PUT, etc.)
123s
123s :param url:
123s The URL to perform the request on.
123s
123s :param body:
123s Data to send in the request body, either :class:`str`, :class:`bytes`,
123s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s :param headers:
123s Dictionary of custom headers to send, such as User-Agent,
123s If-None-Match, etc. If None, pool headers are used. If provided,
123s these headers completely replace any pool-specific headers.
123s
123s :param retries:
123s Configure the number of retries to allow before raising a
123s :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s Pass ``None`` to retry until you receive a response. Pass a
123s :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s over different types of retries.
123s Pass an integer number to retry connection errors that many times,
123s but no other types of errors. Pass zero to never retry.
123s
123s If ``False``, then retries are disabled and any exception is raised
123s immediately. Also, instead of raising a MaxRetryError on redirects,
123s the redirect response will be returned.
123s
123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s :param redirect:
123s If True, automatically handle redirects (status codes 301, 302,
123s 303, 307, 308). Each redirect counts as a retry. Disabling retries
123s will disable redirect, too.
123s
123s :param assert_same_host:
123s If ``True``, will make sure that the host of the pool requests is
123s consistent else will raise HostChangedError. When ``False``, you can
123s use the pool on an HTTP proxy and request foreign hosts.
123s
123s :param timeout:
123s If specified, overrides the default timeout for this one
123s request. It may be a float (in seconds) or an instance of
123s :class:`urllib3.util.Timeout`.
123s
123s :param pool_timeout:
123s If set and the pool is set to block=True, then this method will
123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s connection is available within the time period.
123s
123s :param bool preload_content:
123s If True, the response's body will be preloaded into memory.
123s
123s :param bool decode_content:
123s If True, will attempt to decode the body based on the
123s 'content-encoding' header.
123s
123s :param release_conn:
123s If False, then the urlopen call will not release the connection
123s back into the pool once a response is received (but will release if
123s you read the entire contents of the response such as when
123s `preload_content=True`). This is useful if you're not preloading
123s the response's content immediately. You will need to call
123s ``r.release_conn()`` on the response ``r`` to return the connection
123s back into the pool. If None, it takes the value of ``preload_content``
123s which defaults to ``True``.
123s
123s :param bool chunked:
123s If True, urllib3 will send the body using chunked transfer
123s encoding. Otherwise, urllib3 will send the body using the standard
123s content-length form. Defaults to False.
123s
123s :param int body_pos:
123s Position to seek to in file-like body in the event of a retry or
123s redirect. Typically this won't need to be set because urllib3 will
123s auto-populate the value when needed.
123s """
123s parsed_url = parse_url(url)
123s destination_scheme = parsed_url.scheme
123s
123s if headers is None:
123s headers = self.headers
123s
123s if not isinstance(retries, Retry):
123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s if release_conn is None:
123s release_conn = preload_content
123s
123s # Check host
123s if assert_same_host and not self.is_same_host(url):
123s raise HostChangedError(self, url, retries)
123s
123s # Ensure that the URL we're connecting to is properly encoded
123s if url.startswith("/"):
123s url = to_str(_encode_target(url))
123s else:
123s url = to_str(parsed_url.url)
123s
123s conn = None
123s
123s # Track whether `conn` needs to be released before
123s # returning/raising/recursing. Update this variable if necessary, and
123s # leave `release_conn` constant throughout the function. That way, if
123s # the function recurses, the original value of `release_conn` will be
123s # passed down into the recursive call, and its value will be respected.
123s #
123s # See issue #651 [1] for details.
123s #
123s # [1]
123s release_this_conn = release_conn
123s
123s http_tunnel_required = connection_requires_http_tunnel(
123s self.proxy, self.proxy_config, destination_scheme
123s )
123s
123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s # have to copy the headers dict so we can safely change it without those
123s # changes being reflected in anyone else's copy.
123s if not http_tunnel_required:
123s headers = headers.copy() # type: ignore[attr-defined]
123s headers.update(self.proxy_headers) # type: ignore[union-attr]
123s
123s # Must keep the exception bound to a separate variable or else Python 3
123s # complains about UnboundLocalError.
123s err = None
123s
123s # Keep track of whether we cleanly exited the except block. This
123s # ensures we do proper cleanup in finally.
123s clean_exit = False
123s
123s # Rewind body position, if needed. Record current position
123s # for future rewinds in the event of a redirect/retry.
123s body_pos = set_file_position(body, body_pos)
123s
123s try:
123s # Request a connection from the queue.
123s timeout_obj = self._get_timeout(timeout)
123s conn = self._get_conn(timeout=pool_timeout)
123s
123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s
123s # Is this a closed/new connection that requires CONNECT tunnelling?
123s if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s try:
123s self._prepare_proxy(conn)
123s except (BaseSSLError, OSError, SocketTimeout) as e:
123s self._raise_timeout(
123s err=e, url=self.proxy.url, timeout_value=conn.timeout
123s )
123s raise
123s
123s # If we're going to release the connection in ``finally:``, then
123s # the response doesn't need to know about the connection. Otherwise
123s # it will also try to release it and we'll have a double-release
123s # mess.
123s response_conn = conn if not release_conn else None
123s
123s # Make the request on the HTTPConnection object
123s > response = self._make_request(
123s conn,
123s method,
123s url,
123s timeout=timeout_obj,
123s body=body,
123s headers=headers,
123s chunked=chunked,
123s retries=retries,
123s response_conn=response_conn,
123s preload_content=preload_content,
123s decode_content=decode_content,
123s **response_kw,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s except socket.gaierror as e:
123s raise NameResolutionError(self.host, self, e) from e
123s except SocketTimeout as e:
123s raise ConnectTimeoutError(
123s self,
123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s ) from e
123s
123s except OSError as e:
123s > raise NewConnectionError(
123s self, f"Failed to establish a new connection: {e}"
123s ) from e
123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s > resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s def increment(
123s self,
123s method: str | None = None,
123s url: str | None = None,
123s response: BaseHTTPResponse | None = None,
123s error: Exception | None = None,
123s _pool: ConnectionPool | None = None,
123s _stacktrace: TracebackType | None = None,
123s ) -> Retry:
123s """Return a new Retry object with incremented retry counters.
123s
123s :param response: A response object, or None, if the server did not
123s return a response.
123s :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s :param Exception error: An error encountered during the request, or
123s None if the response was received successfully.
123s
123s :return: A new ``Retry`` object.
123s """
123s if self.total is False and error:
123s # Disabled, indicate to re-raise the error.
123s raise reraise(type(error), error, _stacktrace)
123s
123s total = self.total
123s if total is not None:
123s total -= 1
123s
123s connect = self.connect
123s read = self.read
123s redirect = self.redirect
123s status_count = self.status
123s other = self.other
123s cause = "unknown"
123s status = None
123s redirect_location = None
123s
123s if error and self._is_connection_error(error):
123s # Connect retry?
123s if connect is False:
123s raise reraise(type(error), error, _stacktrace)
123s elif connect is not None:
123s connect -= 1
123s
123s elif error and self._is_read_error(error):
123s # Read retry?
123s if read is False or method is None or not self._is_method_retryable(method):
123s raise reraise(type(error), error, _stacktrace)
123s elif read is not None:
123s read -= 1
123s
123s elif error:
123s # Other retry?
123s if other is not None:
123s other -= 1
123s
123s elif response and response.get_redirect_location():
123s # Redirect retry?
123s if redirect is not None:
123s redirect -= 1
123s cause = "too many redirects"
123s response_redirect_location = response.get_redirect_location()
123s if response_redirect_location:
123s redirect_location = response_redirect_location
123s status = response.status
123s
123s else:
123s # Incrementing because of a server error like a 500 in
123s # status_forcelist and the given method is in the allowed_methods
123s cause = ResponseError.GENERIC_ERROR
123s if response and response.status:
123s if status_count is not None:
123s status_count -= 1
123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s status = response.status
123s
123s history = self.history + (
123s RequestHistory(method, url, error, status, redirect_location),
123s )
123s
123s new_retry = self.new(
123s total=total,
123s connect=connect,
123s read=read,
123s redirect=redirect,
123s status=status_count,
123s other=other,
123s history=history,
123s )
123s
123s if new_retry.is_exhausted():
123s reason = error or ResponseError(cause)
123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s > cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s except (ProtocolError, OSError) as err:
123s raise ConnectionError(err, request=request)
123s
123s except MaxRetryError as e:
123s if isinstance(e.reason, ConnectTimeoutError):
123s # TODO: Remove this in 3.0.0: see #2811
123s if not isinstance(e.reason, NewConnectionError):
123s raise ConnectTimeout(e, request=request)
123s
123s if isinstance(e.reason, ResponseError):
123s raise RetryError(e, request=request)
123s
123s if isinstance(e.reason, _ProxyError):
123s raise ProxyError(e, request=request)
123s
123s if isinstance(e.reason, _SSLError):
123s # This branch is for urllib3 v1.22 and later.
123s raise SSLError(e, request=request)
123s
123s > raise ConnectionError(e, request=request)
123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s @classmethod
123s def setup_class(cls):
123s cls.tmp_dir = TemporaryDirectory()
123s def tmp(*parts):
123s path = os.path.join(cls.tmp_dir.name, *parts)
123s try:
123s os.makedirs(path)
123s except OSError as e:
123s if e.errno != errno.EEXIST:
123s raise
123s return path
123s
123s cls.home_dir = tmp('home')
123s data_dir = cls.data_dir = tmp('data')
123s config_dir = cls.config_dir = tmp('config')
123s runtime_dir = cls.runtime_dir = tmp('runtime')
123s cls.notebook_dir = tmp('notebooks')
123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s cls.env_patch.start()
123s # Patch systemwide & user-wide data & config directories, to isolate
123s # the tests from oddities of the local setup. But leave Python env
123s # locations alone, so data files for e.g. nbconvert are accessible.
123s # If this isolation isn't sufficient, you may need to run the tests in
123s # a virtualenv or conda env.
123s cls.path_patch = patch.multiple(
123s jupyter_core.paths,
123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s )
123s cls.path_patch.start()
123s
123s config = cls.config or Config()
123s config.NotebookNotary.db_file = ':memory:'
123s
123s cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s started = Event()
123s def start_thread():
123s try:
123s bind_args = cls.get_bind_args()
123s app = cls.notebook = NotebookApp(
123s port_retries=0,
123s open_browser=False,
123s config_dir=cls.config_dir,
123s data_dir=cls.data_dir,
123s runtime_dir=cls.runtime_dir,
123s notebook_dir=cls.notebook_dir,
123s base_url=cls.url_prefix,
123s config=config,
123s allow_root=True,
123s token=cls.token,
123s **bind_args
123s )
123s if "asyncio" in sys.modules:
123s app._init_asyncio_patch()
123s import asyncio
123s
123s asyncio.set_event_loop(asyncio.new_event_loop())
123s # Patch the current loop in order to match production
123s # behavior
123s import nest_asyncio
123s
123s nest_asyncio.apply()
123s # don't register signal handler during tests
123s app.init_signal = lambda : None
123s # clear log handlers and propagate to root for nose to capture it
123s # needs to be redone after initialize, which reconfigures logging
123s app.log.propagate = True
123s app.log.handlers = []
123s app.initialize(argv=cls.get_argv())
123s app.log.propagate = True
123s app.log.handlers = []
123s loop = IOLoop.current()
123s loop.add_callback(started.set)
123s app.start()
123s finally:
123s # set the event, so failure to start doesn't cause a hang
123s started.set()
123s app.session_manager.close()
123s cls.notebook_thread = Thread(target=start_thread)
123s cls.notebook_thread.daemon = True
123s cls.notebook_thread.start()
123s started.wait()
123s > cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s cls.fetch_url(url)
123s except ModuleNotFoundError as error:
123s # Errors that should be immediately thrown back to caller
123s raise error
123s except Exception as e:
123s if not cls.notebook_thread.is_alive():
123s > raise RuntimeError("The notebook server failed to start") from e
123s E RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ___ ERROR at setup of GenericFileCheckpointsAPITest.test_get_dir_no_content ____
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s > sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s address: tuple[str, int],
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s source_address: tuple[str, int] | None = None,
123s socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s """Connect to *address* and return the socket object.
123s
123s Convenience function. Connect to *address* (a 2-tuple ``(host,
123s port)``) and return the socket object. Passing the optional
123s *timeout* parameter will set the timeout on the socket instance
123s before attempting to connect. If no *timeout* is supplied, the
123s global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s is used. If *source_address* is set it must be a tuple of (host, port)
123s for the socket to bind as a source address before making the connection.
123s An host of '' or port 0 tells the OS to use the default.
123s """
123s
123s host, port = address
123s if host.startswith("["):
123s host = host.strip("[]")
123s err = None
123s
123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s # The original create_connection function always returns all records.
123s family = allowed_gai_family()
123s
123s try:
123s host.encode("idna")
123s except UnicodeError:
123s raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s af, socktype, proto, canonname, sa = res
123s sock = None
123s try:
123s sock = socket.socket(af, socktype, proto)
123s
123s # If provided, set socket level options before connecting.
123s _set_socket_options(sock, socket_options)
123s
123s if timeout is not _DEFAULT_TIMEOUT:
123s sock.settimeout(timeout)
123s if source_address:
123s sock.bind(source_address)
123s > sock.connect(sa)
123s E ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s def urlopen( # type: ignore[override]
123s self,
123s method: str,
123s url: str,
123s body: _TYPE_BODY | None = None,
123s headers: typing.Mapping[str, str] | None = None,
123s retries: Retry | bool | int | None = None,
123s redirect: bool = True,
123s assert_same_host: bool = True,
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s pool_timeout: int | None = None,
123s release_conn: bool | None = None,
123s chunked: bool = False,
123s body_pos: _TYPE_BODY_POSITION | None = None,
123s preload_content: bool = True,
123s decode_content: bool = True,
123s **response_kw: typing.Any,
123s ) -> BaseHTTPResponse:
123s """
123s Get a connection from the pool and perform an HTTP request. This is the
123s lowest level call for making a request, so you'll need to specify all
123s the raw details.
123s
123s .. note::
123s
123s More commonly, it's appropriate to use a convenience method
123s such as :meth:`request`.
123s
123s .. note::
123s
123s `release_conn` will only behave as expected if
123s `preload_content=False` because we want to make
123s `preload_content=False` the default behaviour someday soon without
123s breaking backwards compatibility.
123s
123s :param method:
123s HTTP request method (such as GET, POST, PUT, etc.)
123s
123s :param url:
123s The URL to perform the request on.
123s
123s :param body:
123s Data to send in the request body, either :class:`str`, :class:`bytes`,
123s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s :param headers:
123s Dictionary of custom headers to send, such as User-Agent,
123s If-None-Match, etc. If None, pool headers are used. If provided,
123s these headers completely replace any pool-specific headers.
123s
123s :param retries:
123s Configure the number of retries to allow before raising a
123s :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s Pass ``None`` to retry until you receive a response. Pass a
123s :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s over different types of retries.
123s Pass an integer number to retry connection errors that many times,
123s but no other types of errors. Pass zero to never retry.
123s
123s If ``False``, then retries are disabled and any exception is raised
123s immediately. Also, instead of raising a MaxRetryError on redirects,
123s the redirect response will be returned.
123s
123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s :param redirect:
123s If True, automatically handle redirects (status codes 301, 302,
123s 303, 307, 308). Each redirect counts as a retry. Disabling retries
123s will disable redirect, too.
123s
123s :param assert_same_host:
123s If ``True``, will make sure that the host of the pool requests is
123s consistent else will raise HostChangedError. When ``False``, you can
123s use the pool on an HTTP proxy and request foreign hosts.
123s
123s :param timeout:
123s If specified, overrides the default timeout for this one
123s request. It may be a float (in seconds) or an instance of
123s :class:`urllib3.util.Timeout`.
123s
123s :param pool_timeout:
123s If set and the pool is set to block=True, then this method will
123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s connection is available within the time period.
123s
123s :param bool preload_content:
123s If True, the response's body will be preloaded into memory.
123s
123s :param bool decode_content:
123s If True, will attempt to decode the body based on the
123s 'content-encoding' header.
123s
123s :param release_conn:
123s If False, then the urlopen call will not release the connection
123s back into the pool once a response is received (but will release if
123s you read the entire contents of the response such as when
123s `preload_content=True`). This is useful if you're not preloading
123s the response's content immediately. You will need to call
123s ``r.release_conn()`` on the response ``r`` to return the connection
123s back into the pool. If None, it takes the value of ``preload_content``
123s which defaults to ``True``.
123s
123s :param bool chunked:
123s If True, urllib3 will send the body using chunked transfer
123s encoding. Otherwise, urllib3 will send the body using the standard
123s content-length form. Defaults to False.
123s
123s :param int body_pos:
123s Position to seek to in file-like body in the event of a retry or
123s redirect. Typically this won't need to be set because urllib3 will
123s auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s
123s         if headers is None:
123s             headers = self.headers
123s
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s         if release_conn is None:
123s             release_conn = preload_content
123s
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s
123s         conn = None
123s
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1]
123s         release_this_conn = release_conn
123s
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _____ ERROR at setup of GenericFileCheckpointsAPITest.test_get_nb_contents _____
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s             More commonly, it's appropriate to use a convenience method
123s             such as :meth:`request`.
123s
123s         .. note::
123s
123s             `release_conn` will only behave as expected if
123s             `preload_content=False` because we want to make
123s             `preload_content=False` the default behaviour someday soon without
123s             breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s [... remainder of this traceback repeats, verbatim, the NewConnectionError / MaxRetryError chain shown for the previous test setup error ...]
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _____ ERROR at setup of GenericFileCheckpointsAPITest.test_get_nb_invalid ______ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ____ ERROR at setup of GenericFileCheckpointsAPITest.test_get_nb_no_content ____ 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _ ERROR at setup of GenericFileCheckpointsAPITest.test_get_text_file_contents __ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object.
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ________ ERROR at setup of GenericFileCheckpointsAPITest.test_list_dirs ________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
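The `wait_until_alive` loop shown above polls `api/contents` until the server answers, and bails out with the `RuntimeError` seen at `launchnotebook.py:59` once the server thread has died. The same wait-for-port pattern can be sketched with only the stdlib; `MAX_WAITTIME` and `POLL_INTERVAL` here are assumed stand-ins, not the suite's actual constants, and the TCP connect replaces the suite's HTTP `fetch_url`:

```python
import socket
import time

MAX_WAITTIME = 30.0   # assumed stand-in for the suite's constant
POLL_INTERVAL = 0.1   # assumed stand-in

def wait_until_alive(host, port, server_thread_alive=lambda: True):
    """Poll (host, port) until a TCP connect succeeds, mirroring the loop above."""
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError:
            # Like the suite's version: once the server thread is dead,
            # further polling is pointless -- this is the RuntimeError
            # that ends every traceback in this log.
            if not server_thread_alive():
                raise RuntimeError("The notebook server failed to start")
            time.sleep(POLL_INTERVAL)
    raise RuntimeError("server never came up")

# Demo: a bound, listening socket counts as "alive" on the first poll.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
host, port = listener.getsockname()
alive = wait_until_alive(host, port)
listener.close()
```

With the listener closed and a dead-thread callback, the same call raises the `RuntimeError` immediately instead of burning the whole poll budget.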
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s __ ERROR at setup of GenericFileCheckpointsAPITest.test_list_nonexistant_dir ___ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s A host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting.
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
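The ``Retry.increment`` frame above is the machinery that eventually produces the ``MaxRetryError`` in this log: each failed attempt decrements a counter, and once the counter goes negative the retry object is exhausted and raises. A simplified stand-in for that bookkeeping (hypothetical names; not the real ``urllib3.util.retry.Retry``):

```python
class MaxRetryErrorSketch(Exception):
    """Stand-in for urllib3.exceptions.MaxRetryError."""

def increment(total, error=None):
    # total is False means retries are disabled: re-raise immediately,
    # matching the "Disabled, indicate to re-raise the error" branch above.
    if total is False and error:
        raise error
    # total is None means "retry until a response arrives": no counting.
    if total is not None:
        total -= 1
        if total < 0:  # is_exhausted()
            raise MaxRetryErrorSketch("Max retries exceeded") from error
    return total
```

This also explains the locals shown in the traceback: the test harness uses ``Retry(total=0)``, so the very first connection refusal drives ``total`` to -1 and raises.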
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
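The ``except MaxRetryError`` chain above is where requests translates urllib3's wrapped failure into its own exception hierarchy: the type of ``e.reason`` selects the user-facing exception, with ``ConnectionError`` as the fallback (which is exactly what this log ends up raising, since the reason is a ``NewConnectionError``). A toy classifier mirroring that dispatch, returning exception names as strings for illustration (simplified: it omits the ``ConnectTimeoutError``-that-is-also-``NewConnectionError`` special case noted in the TODO):

```python
def classify_reason(reason):
    """Map a MaxRetryError.reason type name to the requests-level
    exception, following the branches in the adapter code above."""
    mapping = {
        "ConnectTimeoutError": "requests.exceptions.ConnectTimeout",
        "ResponseError": "requests.exceptions.RetryError",
        "ProxyError": "requests.exceptions.ProxyError",
        "SSLError": "requests.exceptions.SSLError",
    }
    # Fallback branch: anything else (e.g. NewConnectionError)
    # surfaces as a plain ConnectionError.
    return mapping.get(type(reason).__name__,
                       "requests.exceptions.ConnectionError")
```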
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
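The harness code above starts the notebook server in a daemon thread and then polls it; the ``wait_until_alive`` pattern it relies on can be sketched generically like this (``fetch`` and ``thread_is_alive`` are hypothetical injected callables standing in for ``cls.fetch_url`` and ``cls.notebook_thread.is_alive``):

```python
import time

def wait_until_alive(fetch, thread_is_alive, max_wait=30.0, poll_interval=0.1):
    """Poll fetch() until it succeeds; give up early if the server
    thread has died, matching the RuntimeError seen in this log."""
    for _ in range(int(max_wait / poll_interval)):
        try:
            fetch()
            return True  # server answered
        except Exception as e:
            if not thread_is_alive():
                raise RuntimeError("The notebook server failed to start") from e
            time.sleep(poll_interval)
    raise RuntimeError("The notebook server didn't respond in time")
```

In this test run the thread dies before the server ever binds its port, so every poll hits ``Connection refused`` and the early-exit branch fires, which is the ``RuntimeError`` reported at the end of the traceback.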
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _____ ERROR at setup of GenericFileCheckpointsAPITest.test_list_notebooks ______ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
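The ``create_connection`` loop above tries every address returned by ``getaddrinfo``, remembering the last failure and re-raising it only once all candidates are exhausted; with a single resolved address for ``localhost``, the one ``ConnectionRefusedError`` is what propagates. The control flow can be sketched with an injected connect function (``connect_fn`` is a hypothetical stand-in for the real ``socket.connect``):

```python
def connect_first(addresses, connect_fn):
    """Try each resolved address in turn; keep the last OSError and
    re-raise it if every candidate fails, like create_connection."""
    err = None
    for addr in addresses:
        try:
            return connect_fn(addr)
        except OSError as e:
            err = e  # remember the failure, try the next address
    if err is not None:
        raise err
    raise OSError("getaddrinfo returned an empty list")
```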
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
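The `setup_class` shown in this traceback sandboxes Jupyter's home, data, config, and runtime directories before starting the server thread. A minimal, hypothetical sketch of that isolation pattern using only the standard library (the `JUPYTER_*` variable names are the ones jupyter-core honors, but the helper names and values here are illustrative, not the suite's actual code):

```python
import os
import tempfile
from unittest.mock import patch

# Point every Jupyter lookup at throwaway directories so tests never
# touch the real user or system configuration (simplified sketch of
# what launchnotebook.py's setup_class does).
tmp = tempfile.TemporaryDirectory()

def tmp_path(*parts):
    """Create (if needed) and return a subdirectory of the sandbox."""
    path = os.path.join(tmp.name, *parts)
    os.makedirs(path, exist_ok=True)
    return path

env_patch = patch.dict(os.environ, {
    'JUPYTER_CONFIG_DIR': tmp_path('config'),
    'JUPYTER_DATA_DIR': tmp_path('data'),
    'JUPYTER_RUNTIME_DIR': tmp_path('runtime'),
    'HOME': tmp_path('home'),
})
env_patch.start()
patched_cfg = os.environ['JUPYTER_CONFIG_DIR']
env_patch.stop()      # patch.dict restores the original environment
sandbox = tmp.name
tmp.cleanup()         # sandbox directories are removed again
```

As the log's own comment notes, this patching may not be sufficient isolation on every machine, which is why the suite suggests a virtualenv or conda env as a fallback.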
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s __________ ERROR at setup of GenericFileCheckpointsAPITest.test_mkdir __________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
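The `wait_until_alive` helper above polls the server URL and, on any failure, checks whether the server thread is still running so it can fail fast with `RuntimeError` instead of polling out the full timeout. A self-contained sketch of that poll-with-deadline pattern at the socket level (constants and function name chosen for illustration; the real suite polls over HTTP with `requests`):

```python
import socket
import threading
import time

MAX_WAITTIME = 5       # seconds; illustrative values, not the suite's real ones
POLL_INTERVAL = 0.1

def wait_until_alive(host, port, server_thread):
    """Poll until a TCP connect succeeds, mirroring the pattern in the log."""
    deadline = time.monotonic() + MAX_WAITTIME
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError as e:
            # If the server thread already died, fail fast instead of
            # retrying pointlessly (launchnotebook.py does the same).
            if not server_thread.is_alive():
                raise RuntimeError("The notebook server failed to start") from e
            time.sleep(POLL_INTERVAL)
    raise TimeoutError(f"server at {host}:{port} never came up")

# Usage: wait for a dummy listener on an ephemeral port.
srv = socket.socket()
srv.bind(('127.0.0.1', 0))
srv.listen(1)
dummy = threading.Thread(target=time.sleep, args=(2,))
dummy.start()
alive = wait_until_alive('127.0.0.1', srv.getsockname()[1], dummy)
srv.close()
```

In the failing run above, the connect never succeeds (errno 111), the server thread has already exited, and the helper raises the `RuntimeError: The notebook server failed to start` recorded in the log.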
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
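The `Retry.increment` frames earlier in this log show why the request dies immediately: the adapter passed `Retry(total=0, ...)`, so the very first connection error exhausts the budget and `MaxRetryError` is raised. A toy model of that counter bookkeeping, not urllib3's real API, to make the control flow concrete:

```python
class MaxRetriesExceeded(Exception):
    """Stands in for urllib3.exceptions.MaxRetryError in this sketch."""
    pass

class SimpleRetry:
    """Toy model of urllib3 Retry counter bookkeeping (illustrative only)."""
    def __init__(self, total):
        self.total = total

    def increment(self, error):
        # Mirrors the log's Retry.increment: total=False disables retries
        # and re-raises immediately; None means retry forever; otherwise
        # decrement and raise once the budget is exhausted.
        if self.total is False:
            raise error
        new_total = None if self.total is None else self.total - 1
        if new_total is not None and new_total < 0:
            raise MaxRetriesExceeded() from error
        return SimpleRetry(new_total)

# Retry(total=0): the first ConnectionRefusedError exhausts the budget,
# which is exactly the MaxRetryError path seen in the traceback above.
r = SimpleRetry(total=0)
try:
    r.increment(ConnectionRefusedError(111, "Connection refused"))
    exhausted = False
except MaxRetriesExceeded:
    exhausted = True
```

`requests` then translates the resulting `MaxRetryError` into the `requests.exceptions.ConnectionError` that the test helper finally catches.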
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod
123s     def wait_until_alive(cls):
123s >               raise RuntimeError("The notebook server failed to start") from e
123s E               RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ____ ERROR at setup of GenericFileCheckpointsAPITest.test_mkdir_hidden_400 _____
123s 
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s E               urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s E               urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s E               requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s notebook/tests/launchnotebook.py:198: 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _____ ERROR at setup of GenericFileCheckpointsAPITest.test_mkdir_untitled ______ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls = 
123s
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls = 
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _________ ERROR at setup of GenericFileCheckpointsAPITest.test_rename __________
123s
123s self = 
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s
123s         .. note::
123s
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ____ ERROR at setup of GenericFileCheckpointsAPITest.test_rename_400_hidden ____ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s     """
123s     parsed_url = parse_url(url)
123s     destination_scheme = parsed_url.scheme
123s 
123s     if headers is None:
123s         headers = self.headers
123s 
123s     if not isinstance(retries, Retry):
123s         retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s     if release_conn is None:
123s         release_conn = preload_content
123s 
123s     # Check host
123s     if assert_same_host and not self.is_same_host(url):
123s         raise HostChangedError(self, url, retries)
123s 
123s     # Ensure that the URL we're connecting to is properly encoded
123s     if url.startswith("/"):
123s         url = to_str(_encode_target(url))
123s     else:
123s         url = to_str(parsed_url.url)
123s 
123s     conn = None
123s 
123s     # Track whether `conn` needs to be released before
123s     # returning/raising/recursing. Update this variable if necessary, and
123s     # leave `release_conn` constant throughout the function. That way, if
123s     # the function recurses, the original value of `release_conn` will be
123s     # passed down into the recursive call, and its value will be respected.
123s     #
123s     # See issue #651 [1] for details.
123s     #
123s     # [1]
123s     release_this_conn = release_conn
123s 
123s     http_tunnel_required = connection_requires_http_tunnel(
123s         self.proxy, self.proxy_config, destination_scheme
123s     )
123s 
123s     # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s     # have to copy the headers dict so we can safely change it without those
123s     # changes being reflected in anyone else's copy.
123s     if not http_tunnel_required:
123s         headers = headers.copy()  # type: ignore[attr-defined]
123s         headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s 
123s     # Must keep the exception bound to a separate variable or else Python 3
123s     # complains about UnboundLocalError.
123s     err = None
123s 
123s     # Keep track of whether we cleanly exited the except block. This
123s     # ensures we do proper cleanup in finally.
123s     clean_exit = False
123s 
123s     # Rewind body position, if needed. Record current position
123s     # for future rewinds in the event of a redirect/retry.
123s     body_pos = set_file_position(body, body_pos)
123s 
123s     try:
123s         # Request a connection from the queue.
123s         timeout_obj = self._get_timeout(timeout)
123s         conn = self._get_conn(timeout=pool_timeout)
123s 
123s         conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s 
123s         # Is this a closed/new connection that requires CONNECT tunnelling?
123s         if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s             try:
123s                 self._prepare_proxy(conn)
123s             except (BaseSSLError, OSError, SocketTimeout) as e:
123s                 self._raise_timeout(
123s                     err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                 )
123s                 raise
123s 
123s         # If we're going to release the connection in ``finally:``, then
123s         # the response doesn't need to know about the connection. Otherwise
123s         # it will also try to release it and we'll have a double-release
123s         # mess.
123s         response_conn = conn if not release_conn else None
123s 
123s         # Make the request on the HTTPConnection object
123s >       response = self._make_request(
123s             conn,
123s             method,
123s             url,
123s             timeout=timeout_obj,
123s             body=body,
123s             headers=headers,
123s             chunked=chunked,
123s             retries=retries,
123s             response_conn=response_conn,
123s             preload_content=preload_content,
123s             decode_content=decode_content,
123s             **response_kw,
123s         )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s 
123s def _new_conn(self) -> socket.socket:
123s     """Establish a socket connection and set nodelay settings on it.
123s 
123s     :return: New socket connection.
123s     """
123s     try:
123s         sock = connection.create_connection(
123s             (self._dns_host, self.port),
123s             self.timeout,
123s             source_address=self.source_address,
123s             socket_options=self.socket_options,
123s         )
123s     except socket.gaierror as e:
123s         raise NameResolutionError(self.host, self, e) from e
123s     except SocketTimeout as e:
123s         raise ConnectTimeoutError(
123s             self,
123s             f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s         ) from e
123s 
123s     except OSError as e:
123s >       raise NewConnectionError(
123s             self, f"Failed to establish a new connection: {e}"
123s         ) from e
123s E       urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s def send(
123s     self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s     """Sends PreparedRequest object. Returns Response object.
123s 
123s     :param request: The :class:`PreparedRequest ` being sent.
123s     :param stream: (optional) Whether to stream the request content.
123s     :param timeout: (optional) How long to wait for the server to send
123s         data before giving up, as a float, or a :ref:`(connect timeout,
123s         read timeout) ` tuple.
123s     :type timeout: float or tuple or urllib3 Timeout object
123s     :param verify: (optional) Either a boolean, in which case it controls whether
123s         we verify the server's TLS certificate, or a string, in which case it
123s         must be a path to a CA bundle to use
123s     :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s     :param proxies: (optional) The proxies dictionary to apply to the request.
123s     :rtype: requests.Response
123s     """
123s 
123s     try:
123s         conn = self.get_connection(request.url, proxies)
123s     except LocationValueError as e:
123s         raise InvalidURL(e, request=request)
123s 
123s     self.cert_verify(conn, request.url, verify, cert)
123s     url = self.request_url(request, proxies)
123s     self.add_headers(
123s         request,
123s         stream=stream,
123s         timeout=timeout,
123s         verify=verify,
123s         cert=cert,
123s         proxies=proxies,
123s     )
123s 
123s     chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s     if isinstance(timeout, tuple):
123s         try:
123s             connect, read = timeout
123s             timeout = TimeoutSauce(connect=connect, read=read)
123s         except ValueError:
123s             raise ValueError(
123s                 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                 f"or a single float to set both timeouts to the same value."
123s             )
123s     elif isinstance(timeout, TimeoutSauce):
123s         pass
123s     else:
123s         timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s     try:
123s >       resp = conn.urlopen(
123s             method=request.method,
123s             url=url,
123s             body=request.body,
123s             headers=request.headers,
123s             redirect=False,
123s             assert_same_host=False,
123s             preload_content=False,
123s             decode_content=False,
123s             retries=self.max_retries,
123s             timeout=timeout,
123s             chunked=chunked,
123s         )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s def increment(
123s     self,
123s     method: str | None = None,
123s     url: str | None = None,
123s     response: BaseHTTPResponse | None = None,
123s     error: Exception | None = None,
123s     _pool: ConnectionPool | None = None,
123s     _stacktrace: TracebackType | None = None,
123s ) -> Retry:
123s     """Return a new Retry object with incremented retry counters.
123s 
123s     :param response: A response object, or None, if the server did not
123s         return a response.
123s     :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s     :param Exception error: An error encountered during the request, or
123s         None if the response was received successfully.
123s 
123s     :return: A new ``Retry`` object.
123s     """
123s     if self.total is False and error:
123s         # Disabled, indicate to re-raise the error.
123s         raise reraise(type(error), error, _stacktrace)
123s 
123s     total = self.total
123s     if total is not None:
123s         total -= 1
123s 
123s     connect = self.connect
123s     read = self.read
123s     redirect = self.redirect
123s     status_count = self.status
123s     other = self.other
123s     cause = "unknown"
123s     status = None
123s     redirect_location = None
123s 
123s     if error and self._is_connection_error(error):
123s         # Connect retry?
123s         if connect is False:
123s             raise reraise(type(error), error, _stacktrace)
123s         elif connect is not None:
123s             connect -= 1
123s 
123s     elif error and self._is_read_error(error):
123s         # Read retry?
123s         if read is False or method is None or not self._is_method_retryable(method):
123s             raise reraise(type(error), error, _stacktrace)
123s         elif read is not None:
123s             read -= 1
123s 
123s     elif error:
123s         # Other retry?
123s         if other is not None:
123s             other -= 1
123s 
123s     elif response and response.get_redirect_location():
123s         # Redirect retry?
123s         if redirect is not None:
123s             redirect -= 1
123s         cause = "too many redirects"
123s         response_redirect_location = response.get_redirect_location()
123s         if response_redirect_location:
123s             redirect_location = response_redirect_location
123s         status = response.status
123s 
123s     else:
123s         # Incrementing because of a server error like a 500 in
123s         # status_forcelist and the given method is in the allowed_methods
123s         cause = ResponseError.GENERIC_ERROR
123s         if response and response.status:
123s             if status_count is not None:
123s                 status_count -= 1
123s             cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s             status = response.status
123s 
123s     history = self.history + (
123s         RequestHistory(method, url, error, status, redirect_location),
123s     )
123s 
123s     new_retry = self.new(
123s         total=total,
123s         connect=connect,
123s         read=read,
123s         redirect=redirect,
123s         status=status_count,
123s         other=other,
123s         history=history,
123s     )
123s 
123s     if new_retry.is_exhausted():
123s         reason = error or ResponseError(cause)
123s >       raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E       urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s @classmethod
123s def wait_until_alive(cls):
123s     """Wait for the server to be alive"""
123s     url = cls.base_url() + 'api/contents'
123s     for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s         try:
123s >           cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s def send(
123s     self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s     """Sends PreparedRequest object. Returns Response object.
123s 
123s     :param request: The :class:`PreparedRequest ` being sent.
123s     :param stream: (optional) Whether to stream the request content.
123s     :param timeout: (optional) How long to wait for the server to send
123s         data before giving up, as a float, or a :ref:`(connect timeout,
123s         read timeout) ` tuple.
123s     :type timeout: float or tuple or urllib3 Timeout object
123s     :param verify: (optional) Either a boolean, in which case it controls whether
123s         we verify the server's TLS certificate, or a string, in which case it
123s         must be a path to a CA bundle to use
123s     :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s     :param proxies: (optional) The proxies dictionary to apply to the request.
123s     :rtype: requests.Response
123s     """
123s 
123s     try:
123s         conn = self.get_connection(request.url, proxies)
123s     except LocationValueError as e:
123s         raise InvalidURL(e, request=request)
123s 
123s     self.cert_verify(conn, request.url, verify, cert)
123s     url = self.request_url(request, proxies)
123s     self.add_headers(
123s         request,
123s         stream=stream,
123s         timeout=timeout,
123s         verify=verify,
123s         cert=cert,
123s         proxies=proxies,
123s     )
123s 
123s     chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s     if isinstance(timeout, tuple):
123s         try:
123s             connect, read = timeout
123s             timeout = TimeoutSauce(connect=connect, read=read)
123s         except ValueError:
123s             raise ValueError(
123s                 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                 f"or a single float to set both timeouts to the same value."
123s             )
123s     elif isinstance(timeout, TimeoutSauce):
123s         pass
123s     else:
123s         timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s     try:
123s         resp = conn.urlopen(
123s             method=request.method,
123s             url=url,
123s             body=request.body,
123s             headers=request.headers,
123s             redirect=False,
123s             assert_same_host=False,
123s             preload_content=False,
123s             decode_content=False,
123s             retries=self.max_retries,
123s             timeout=timeout,
123s             chunked=chunked,
123s         )
123s 
123s     except (ProtocolError, OSError) as err:
123s         raise ConnectionError(err, request=request)
123s 
123s     except MaxRetryError as e:
123s         if isinstance(e.reason, ConnectTimeoutError):
123s             # TODO: Remove this in 3.0.0: see #2811
123s             if not isinstance(e.reason, NewConnectionError):
123s                 raise ConnectTimeout(e, request=request)
123s 
123s         if isinstance(e.reason, ResponseError):
123s             raise RetryError(e, request=request)
123s 
123s         if isinstance(e.reason, _ProxyError):
123s             raise ProxyError(e, request=request)
123s 
123s         if isinstance(e.reason, _SSLError):
123s             # This branch is for urllib3 v1.22 and later.
123s             raise SSLError(e, request=request)
123s 
123s >       raise ConnectionError(e, request=request)
123s E       requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s @classmethod
123s def setup_class(cls):
123s     cls.tmp_dir = TemporaryDirectory()
123s     def tmp(*parts):
123s         path = os.path.join(cls.tmp_dir.name, *parts)
123s         try:
123s             os.makedirs(path)
123s         except OSError as e:
123s             if e.errno != errno.EEXIST:
123s                 raise
123s         return path
123s 
123s     cls.home_dir = tmp('home')
123s     data_dir = cls.data_dir = tmp('data')
123s     config_dir = cls.config_dir = tmp('config')
123s     runtime_dir = cls.runtime_dir = tmp('runtime')
123s     cls.notebook_dir = tmp('notebooks')
123s     cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s     cls.env_patch.start()
123s     # Patch systemwide & user-wide data & config directories, to isolate
123s     # the tests from oddities of the local setup. But leave Python env
123s     # locations alone, so data files for e.g. nbconvert are accessible.
123s     # If this isolation isn't sufficient, you may need to run the tests in
123s     # a virtualenv or conda env.
123s     cls.path_patch = patch.multiple(
123s         jupyter_core.paths,
123s         SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s         SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s     )
123s     cls.path_patch.start()
123s 
123s     config = cls.config or Config()
123s     config.NotebookNotary.db_file = ':memory:'
123s 
123s     cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s     started = Event()
123s     def start_thread():
123s         try:
123s             bind_args = cls.get_bind_args()
123s             app = cls.notebook = NotebookApp(
123s                 port_retries=0,
123s                 open_browser=False,
123s                 config_dir=cls.config_dir,
123s                 data_dir=cls.data_dir,
123s                 runtime_dir=cls.runtime_dir,
123s                 notebook_dir=cls.notebook_dir,
123s                 base_url=cls.url_prefix,
123s                 config=config,
123s                 allow_root=True,
123s                 token=cls.token,
123s                 **bind_args
123s             )
123s             if "asyncio" in sys.modules:
123s                 app._init_asyncio_patch()
123s             import asyncio
123s 
123s             asyncio.set_event_loop(asyncio.new_event_loop())
123s             # Patch the current loop in order to match production
123s             # behavior
123s             import nest_asyncio
123s 
123s             nest_asyncio.apply()
123s             # don't register signal handler during tests
123s             app.init_signal = lambda : None
123s             # clear log handlers and propagate to root for nose to capture it
123s             # needs to be redone after initialize, which reconfigures logging
123s             app.log.propagate = True
123s             app.log.handlers = []
123s             app.initialize(argv=cls.get_argv())
123s             app.log.propagate = True
123s             app.log.handlers = []
123s             loop = IOLoop.current()
123s             loop.add_callback(started.set)
123s             app.start()
123s         finally:
123s             # set the event, so failure to start doesn't cause a hang
123s             started.set()
123s             app.session_manager.close()
123s     cls.notebook_thread = Thread(target=start_thread)
123s     cls.notebook_thread.daemon = True
123s     cls.notebook_thread.start()
123s     started.wait()
123s >   cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s @classmethod
123s def wait_until_alive(cls):
123s     """Wait for the server to be alive"""
123s     url = cls.base_url() + 'api/contents'
123s     for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s         try:
123s             cls.fetch_url(url)
123s         except ModuleNotFoundError as error:
123s             # Errors that should be immediately thrown back to caller
123s             raise error
123s         except Exception as e:
123s             if not cls.notebook_thread.is_alive():
123s >               raise RuntimeError("The notebook server failed to start") from e
123s E               RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _____ ERROR at setup of GenericFileCheckpointsAPITest.test_rename_existing _____
123s 
123s self = 
123s 
123s def _new_conn(self) -> socket.socket:
123s     """Establish a socket connection and set nodelay settings on it.
123s 
123s     :return: New socket connection.
123s     """
123s     try:
123s >       sock = connection.create_connection(
123s             (self._dns_host, self.port),
123s             self.timeout,
123s             source_address=self.source_address,
123s             socket_options=self.socket_options,
123s         )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s 
123s     Convenience function.  Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object.
123s     Passing the optional
123s     *timeout* parameter will set the timeout on the socket instance
123s     before attempting to connect. If no *timeout* is supplied, the
123s     global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s     is used. If *source_address* is set it must be a tuple of (host, port)
123s     for the socket to bind as a source address before making the connection.
123s     An host of '' or port 0 tells the OS to use the default.
123s     """
123s 
123s     host, port = address
123s     if host.startswith("["):
123s         host = host.strip("[]")
123s     err = None
123s 
123s     # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s     # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s     # The original create_connection function always returns all records.
123s     family = allowed_gai_family()
123s 
123s     try:
123s         host.encode("idna")
123s     except UnicodeError:
123s         raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s         af, socktype, proto, canonname, sa = res
123s         sock = None
123s         try:
123s             sock = socket.socket(af, socktype, proto)
123s 
123s             # If provided, set socket level options before connecting.
123s             _set_socket_options(sock, socket_options)
123s 
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s def urlopen(  # type: ignore[override]
123s     self,
123s     method: str,
123s     url: str,
123s     body: _TYPE_BODY | None = None,
123s     headers: typing.Mapping[str, str] | None = None,
123s     retries: Retry | bool | int | None = None,
123s     redirect: bool = True,
123s     assert_same_host: bool = True,
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     pool_timeout: int | None = None,
123s     release_conn: bool | None = None,
123s     chunked: bool = False,
123s     body_pos: _TYPE_BODY_POSITION | None = None,
123s     preload_content: bool = True,
123s     decode_content: bool = True,
123s     **response_kw: typing.Any,
123s ) -> BaseHTTPResponse:
123s     """
123s     Get a connection from the pool and perform an HTTP request. This is the
123s     lowest level call for making a request, so you'll need to specify all
123s     the raw details.
123s 
123s     .. note::
123s 
123s         More commonly, it's appropriate to use a convenience method
123s         such as :meth:`request`.
123s 
123s     .. note::
123s 
123s         `release_conn` will only behave as expected if
123s         `preload_content=False` because we want to make
123s         `preload_content=False` the default behaviour someday soon without
123s         breaking backwards compatibility.
123s 
123s     :param method:
123s         HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s     :param url:
123s         The URL to perform the request on.
123s 
123s     :param body:
123s         Data to send in the request body, either :class:`str`, :class:`bytes`,
123s         an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s     :param headers:
123s         Dictionary of custom headers to send, such as User-Agent,
123s         If-None-Match, etc. If None, pool headers are used. If provided,
123s         these headers completely replace any pool-specific headers.
123s 
123s     :param retries:
123s         Configure the number of retries to allow before raising a
123s         :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s         Pass ``None`` to retry until you receive a response. Pass a
123s         :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s         over different types of retries.
123s         Pass an integer number to retry connection errors that many times,
123s         but no other types of errors. Pass zero to never retry.
123s 
123s         If ``False``, then retries are disabled and any exception is raised
123s         immediately. Also, instead of raising a MaxRetryError on redirects,
123s         the redirect response will be returned.
123s 
123s     :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s     :param redirect:
123s         If True, automatically handle redirects (status codes 301, 302,
123s         303, 307, 308). Each redirect counts as a retry. Disabling retries
123s         will disable redirect, too.
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s __________ ERROR at setup of GenericFileCheckpointsAPITest.test_save ___________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _________ ERROR at setup of GenericFileCheckpointsAPITest.test_upload __________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _______ ERROR at setup of GenericFileCheckpointsAPITest.test_upload_b64 ________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s 
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen( # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy() # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers) # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s 
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >       raise ConnectionError(e, request=request)
123s E       requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s cls = 
123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _______ ERROR at setup of GenericFileCheckpointsAPITest.test_upload_txt ________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s
123s         .. note::
123s
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s
123s         if headers is None:
123s             headers = self.headers
123s
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s         if release_conn is None:
123s             release_conn = preload_content
123s
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s
123s         conn = None
123s
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed.
Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in
send
123s         self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s
123s self = 
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or
ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls = 
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
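[editor's note] `Retry(total=0, ...)` in the context above explains why a single refused connection becomes `MaxRetryError` immediately: the total budget is consumed by the first error. A simplified, stdlib-only model of the `total` bookkeeping in `Retry.increment()`; the `MaxRetryError` class here is a local stand-in for illustration, not the urllib3 type:

```python
class MaxRetryError(Exception):
    """Local stand-in for urllib3.exceptions.MaxRetryError (sketch only)."""

def increment(total, error=None):
    """Model the `total` counter of Retry.increment().

    - total is False: retries disabled, re-raise the error at once.
    - total is None: unlimited retries, nothing to count down.
    - otherwise: spend one unit per error; a negative budget means
      is_exhausted() is true and the error is wrapped in MaxRetryError.
    """
    if total is False and error is not None:
        raise error
    if total is None:
        return None
    total -= 1
    if total < 0:  # corresponds to new_retry.is_exhausted() above
        raise MaxRetryError(error)
    return total
```

With `total=0`, the first `ConnectionRefusedError` drives the budget to -1 and the retry machinery gives up on the spot, which is exactly the `Max retries exceeded` message in the log.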
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls = 
123s
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s
123s cls = 
123s
123s
@classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ____ ERROR at setup of GenericFileCheckpointsAPITest.test_upload_txt_hidden ____
123s
123s self = 
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object.
Passing the optional
123s     *timeout* parameter will set the timeout on the socket instance
123s     before attempting to connect. If no *timeout* is supplied, the
123s     global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s     is used. If *source_address* is set it must be a tuple of (host, port)
123s     for the socket to bind as a source address before making the connection.
123s     An host of '' or port 0 tells the OS to use the default.
123s     """
123s
123s     host, port = address
123s     if host.startswith("["):
123s         host = host.strip("[]")
123s     err = None
123s
123s     # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s     # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s     # The original create_connection function always returns all records.
123s     family = allowed_gai_family()
123s
123s     try:
123s         host.encode("idna")
123s     except UnicodeError:
123s         raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s         af, socktype, proto, canonname, sa = res
123s         sock = None
123s         try:
123s             sock = socket.socket(af, socktype, proto)
123s
123s             # If provided, set socket level options before connecting.
123s             _set_socket_options(sock, socket_options)
123s
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request.
This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s
123s         .. note::
123s
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s
123s         if headers is None:
123s             headers = self.headers
123s
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s         if release_conn is None:
123s             release_conn = preload_content
123s
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s
123s         conn = None
123s
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed.
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
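Every traceback in this run bottoms out in `[Errno 111] Connection refused` because nothing is listening on the notebook port; a self-contained, stdlib-only reproduction of that root cause (the port is chosen at runtime, not the log's 12341):

```python
import socket

# Reserve a free port, then close the listener so the port is (almost
# certainly) closed again -- connecting to it then fails with
# ECONNREFUSED, the errno 111 that urllib3 wraps as NewConnectionError.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
closed_port = probe.getsockname()[1]
probe.close()

try:
    socket.create_connection(("127.0.0.1", closed_port), timeout=2)
    err = None
except ConnectionRefusedError as e:
    err = e.errno

print(err)  # errno.ECONNREFUSED (111 on Linux)
```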
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ________ ERROR at setup of GenericFileCheckpointsAPITest.test_upload_v2 ________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
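The `wait_until_alive` loop shown above polls `fetch_url` until the server responds or the notebook thread dies; a simplified, stdlib-only sketch of the same poll-with-deadline pattern (names and bounds are illustrative, not the notebook harness's):

```python
import time

def wait_until_alive(probe, max_wait=5.0, poll_interval=0.05):
    # Poll `probe` until it stops raising; on timeout, chain the last
    # failure into a RuntimeError, as launchnotebook.py does.
    deadline = time.monotonic() + max_wait
    last = None
    while time.monotonic() < deadline:
        try:
            return probe()
        except Exception as e:
            last = e
            time.sleep(poll_interval)
    raise RuntimeError("server failed to start") from last

# Usage: a probe that only succeeds on its third call.
state = {"calls": 0}
def probe():
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("refused")
    return "alive"

result = wait_until_alive(probe)
print(result)  # alive
```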
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _______________ ERROR at setup of KernelAPITest.test_connections _______________
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object. Passing the optional
123s     *timeout* parameter will set the timeout on the socket instance
123s     before attempting to connect. If no *timeout* is supplied, the
123s     global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s     is used. If *source_address* is set it must be a tuple of (host, port)
123s     for the socket to bind as a source address before making the connection.
123s     An host of '' or port 0 tells the OS to use the default.
123s     """
123s
123s     host, port = address
123s     if host.startswith("["):
123s         host = host.strip("[]")
123s     err = None
123s
123s     # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s     # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s     # The original create_connection function always returns all records.
123s     family = allowed_gai_family()
123s
123s     try:
123s         host.encode("idna")
123s     except UnicodeError:
123s         raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s         af, socktype, proto, canonname, sa = res
123s         sock = None
123s         try:
123s             sock = socket.socket(af, socktype, proto)
123s
123s             # If provided, set socket level options before connecting.
123s             _set_socket_options(sock, socket_options)
123s
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen( # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s             More commonly, it's appropriate to use a convenience method
123s             such as :meth:`request`.
123s
123s         .. note::
123s
123s             `release_conn` will only behave as expected if
123s             `preload_content=False` because we want to make
123s             `preload_content=False` the default behaviour someday soon without
123s             breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s
123s         if headers is None:
123s             headers = self.headers
123s
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s         if release_conn is None:
123s             release_conn = preload_content
123s
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s
123s         conn = None
123s
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1]
123s         release_this_conn = release_conn
123s
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy() # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers) # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s
123s             conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _____________ ERROR at setup of KernelAPITest.test_default_kernel ______________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _____________ ERROR at setup of KernelAPITest.test_kernel_handler ______________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s 
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy() # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers) # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s 
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s             status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ___________ ERROR at setup of KernelAPITest.test_main_kernel_handler ___________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options)
123s
123s if timeout is not _DEFAULT_TIMEOUT:
123s sock.settimeout(timeout)
123s if source_address:
123s sock.bind(source_address)
123s > sock.connect(sa)
123s E ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s def urlopen( # type: ignore[override]
123s self,
123s method: str,
123s url: str,
123s body: _TYPE_BODY | None = None,
123s headers: typing.Mapping[str, str] | None = None,
123s retries: Retry | bool | int | None = None,
123s redirect: bool = True,
123s assert_same_host: bool = True,
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s pool_timeout: int | None = None,
123s release_conn: bool | None = None,
123s chunked: bool = False,
123s body_pos: _TYPE_BODY_POSITION | None = None,
123s preload_content: bool = True,
123s decode_content: bool = True,
123s **response_kw: typing.Any,
123s ) -> BaseHTTPResponse:
123s """
123s Get a connection from the pool and perform an HTTP request. This is the
123s lowest level call for making a request, so you'll need to specify all
123s the raw details.
123s
123s .. note::
123s
123s More commonly, it's appropriate to use a convenience method
123s such as :meth:`request`.
123s
123s .. note::
123s
123s `release_conn` will only behave as expected if
123s `preload_content=False` because we want to make
123s `preload_content=False` the default behaviour someday soon without
123s breaking backwards compatibility.
123s
123s :param method:
123s HTTP request method (such as GET, POST, PUT, etc.)
123s
123s :param url:
123s The URL to perform the request on.
123s
123s :param body:
123s Data to send in the request body, either :class:`str`, :class:`bytes`,
123s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s :param headers:
123s Dictionary of custom headers to send, such as User-Agent,
123s If-None-Match, etc. If None, pool headers are used. If provided,
123s these headers completely replace any pool-specific headers.
123s
123s :param retries:
123s Configure the number of retries to allow before raising a
123s :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s Pass ``None`` to retry until you receive a response. Pass a
123s :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s over different types of retries.
123s Pass an integer number to retry connection errors that many times,
123s but no other types of errors. Pass zero to never retry.
123s
123s If ``False``, then retries are disabled and any exception is raised
123s immediately. Also, instead of raising a MaxRetryError on redirects,
123s the redirect response will be returned.
123s
123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s :param redirect:
123s If True, automatically handle redirects (status codes 301, 302,
123s 303, 307, 308). Each redirect counts as a retry. Disabling retries
123s will disable redirect, too.
123s
123s :param assert_same_host:
123s If ``True``, will make sure that the host of the pool requests is
123s consistent else will raise HostChangedError. When ``False``, you can
123s use the pool on an HTTP proxy and request foreign hosts.
123s
123s :param timeout:
123s If specified, overrides the default timeout for this one
123s request. It may be a float (in seconds) or an instance of
123s :class:`urllib3.util.Timeout`.
123s
123s :param pool_timeout:
123s If set and the pool is set to block=True, then this method will
123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s connection is available within the time period.
123s
123s :param bool preload_content:
123s If True, the response's body will be preloaded into memory.
123s
123s :param bool decode_content:
123s If True, will attempt to decode the body based on the
123s 'content-encoding' header.
123s
123s :param release_conn:
123s If False, then the urlopen call will not release the connection
123s back into the pool once a response is received (but will release if
123s you read the entire contents of the response such as when
123s `preload_content=True`). This is useful if you're not preloading
123s the response's content immediately. You will need to call
123s ``r.release_conn()`` on the response ``r`` to return the connection
123s back into the pool. If None, it takes the value of ``preload_content``
123s which defaults to ``True``.
123s
123s :param bool chunked:
123s If True, urllib3 will send the body using chunked transfer
123s encoding. Otherwise, urllib3 will send the body using the standard
123s content-length form. Defaults to False.
123s
123s :param int body_pos:
123s Position to seek to in file-like body in the event of a retry or
123s redirect. Typically this won't need to be set because urllib3 will
123s auto-populate the value when needed.
123s """
123s parsed_url = parse_url(url)
123s destination_scheme = parsed_url.scheme
123s
123s if headers is None:
123s headers = self.headers
123s
123s if not isinstance(retries, Retry):
123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s if release_conn is None:
123s release_conn = preload_content
123s
123s # Check host
123s if assert_same_host and not self.is_same_host(url):
123s raise HostChangedError(self, url, retries)
123s
123s # Ensure that the URL we're connecting to is properly encoded
123s if url.startswith("/"):
123s url = to_str(_encode_target(url))
123s else:
123s url = to_str(parsed_url.url)
123s
123s conn = None
123s
123s # Track whether `conn` needs to be released before
123s # returning/raising/recursing. Update this variable if necessary, and
123s # leave `release_conn` constant throughout the function. That way, if
123s # the function recurses, the original value of `release_conn` will be
123s # passed down into the recursive call, and its value will be respected.
123s #
123s # See issue #651 [1] for details.
123s #
123s # [1]
123s release_this_conn = release_conn
123s
123s http_tunnel_required = connection_requires_http_tunnel(
123s self.proxy, self.proxy_config, destination_scheme
123s )
123s
123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s # have to copy the headers dict so we can safely change it without those
123s # changes being reflected in anyone else's copy.
123s if not http_tunnel_required:
123s headers = headers.copy() # type: ignore[attr-defined]
123s headers.update(self.proxy_headers) # type: ignore[union-attr]
123s
123s # Must keep the exception bound to a separate variable or else Python 3
123s # complains about UnboundLocalError.
123s err = None
123s
123s # Keep track of whether we cleanly exited the except block. This
123s # ensures we do proper cleanup in finally.
123s clean_exit = False
123s
123s # Rewind body position, if needed. Record current position
123s # for future rewinds in the event of a redirect/retry.
123s body_pos = set_file_position(body, body_pos)
123s
123s try:
123s # Request a connection from the queue.
123s timeout_obj = self._get_timeout(timeout)
123s conn = self._get_conn(timeout=pool_timeout)
123s
123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s
123s # Is this a closed/new connection that requires CONNECT tunnelling?
123s if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s try:
123s self._prepare_proxy(conn)
123s except (BaseSSLError, OSError, SocketTimeout) as e:
123s self._raise_timeout(
123s err=e, url=self.proxy.url, timeout_value=conn.timeout
123s )
123s raise
123s
123s # If we're going to release the connection in ``finally:``, then
123s # the response doesn't need to know about the connection. Otherwise
123s # it will also try to release it and we'll have a double-release
123s # mess.
123s response_conn = conn if not release_conn else None
123s
123s # Make the request on the HTTPConnection object
123s > response = self._make_request(
123s conn,
123s method,
123s url,
123s timeout=timeout_obj,
123s body=body,
123s headers=headers,
123s chunked=chunked,
123s retries=retries,
123s response_conn=response_conn,
123s preload_content=preload_content,
123s decode_content=decode_content,
123s **response_kw,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s except socket.gaierror as e:
123s raise NameResolutionError(self.host, self, e) from e
123s except SocketTimeout as e:
123s raise ConnectTimeoutError(
123s self,
123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s ) from e
123s
123s except OSError as e:
123s > raise NewConnectionError(
123s self, f"Failed to establish a new connection: {e}"
123s ) from e
123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s > resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s def increment(
123s self,
123s method: str | None = None,
123s url: str | None = None,
123s response: BaseHTTPResponse | None = None,
123s error: Exception | None = None,
123s _pool: ConnectionPool | None = None,
123s _stacktrace: TracebackType | None = None,
123s ) -> Retry:
123s """Return a new Retry object with incremented retry counters.
123s
123s :param response: A response object, or None, if the server did not
123s return a response.
123s :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s :param Exception error: An error encountered during the request, or
123s None if the response was received successfully.
123s
123s :return: A new ``Retry`` object.
123s """
123s if self.total is False and error:
123s # Disabled, indicate to re-raise the error.
123s raise reraise(type(error), error, _stacktrace)
123s
123s total = self.total
123s if total is not None:
123s total -= 1
123s
123s connect = self.connect
123s read = self.read
123s redirect = self.redirect
123s status_count = self.status
123s other = self.other
123s cause = "unknown"
123s status = None
123s redirect_location = None
123s
123s if error and self._is_connection_error(error):
123s # Connect retry?
123s if connect is False:
123s raise reraise(type(error), error, _stacktrace)
123s elif connect is not None:
123s connect -= 1
123s
123s elif error and self._is_read_error(error):
123s # Read retry?
123s if read is False or method is None or not self._is_method_retryable(method):
123s raise reraise(type(error), error, _stacktrace)
123s elif read is not None:
123s read -= 1
123s
123s elif error:
123s # Other retry?
123s if other is not None:
123s other -= 1
123s
123s elif response and response.get_redirect_location():
123s # Redirect retry?
123s if redirect is not None:
123s redirect -= 1
123s cause = "too many redirects"
123s response_redirect_location = response.get_redirect_location()
123s if response_redirect_location:
123s redirect_location = response_redirect_location
123s status = response.status
123s
123s else:
123s # Incrementing because of a server error like a 500 in
123s # status_forcelist and the given method is in the allowed_methods
123s cause = ResponseError.GENERIC_ERROR
123s if response and response.status:
123s if status_count is not None:
123s status_count -= 1
123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s status = response.status
123s
123s history = self.history + (
123s RequestHistory(method, url, error, status, redirect_location),
123s )
123s
123s new_retry = self.new(
123s total=total,
123s connect=connect,
123s read=read,
123s redirect=redirect,
123s status=status_count,
123s other=other,
123s history=history,
123s )
123s
123s if new_retry.is_exhausted():
123s reason = error or ResponseError(cause)
123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s > cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s except (ProtocolError, OSError) as err:
123s raise ConnectionError(err, request=request)
123s
123s except MaxRetryError as e:
123s if isinstance(e.reason, ConnectTimeoutError):
123s # TODO: Remove this in 3.0.0: see #2811
123s if not isinstance(e.reason, NewConnectionError):
123s raise ConnectTimeout(e, request=request)
123s
123s if isinstance(e.reason, ResponseError):
123s raise RetryError(e, request=request)
123s
123s if isinstance(e.reason, _ProxyError):
123s raise ProxyError(e, request=request)
123s
123s if isinstance(e.reason, _SSLError):
123s # This branch is for urllib3 v1.22 and later.
123s raise SSLError(e, request=request)
123s
123s > raise ConnectionError(e, request=request)
123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s @classmethod
123s def setup_class(cls):
123s cls.tmp_dir = TemporaryDirectory()
123s def tmp(*parts):
123s path = os.path.join(cls.tmp_dir.name, *parts)
123s try:
123s os.makedirs(path)
123s except OSError as e:
123s if e.errno != errno.EEXIST:
123s raise
123s return path
123s
123s cls.home_dir = tmp('home')
123s data_dir = cls.data_dir = tmp('data')
123s config_dir = cls.config_dir = tmp('config')
123s runtime_dir = cls.runtime_dir = tmp('runtime')
123s cls.notebook_dir = tmp('notebooks')
123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s cls.env_patch.start()
123s # Patch systemwide & user-wide data & config directories, to isolate
123s # the tests from oddities of the local setup. But leave Python env
123s # locations alone, so data files for e.g. nbconvert are accessible.
123s # If this isolation isn't sufficient, you may need to run the tests in
123s # a virtualenv or conda env.
123s cls.path_patch = patch.multiple(
123s jupyter_core.paths,
123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s )
123s cls.path_patch.start()
123s
123s config = cls.config or Config()
123s config.NotebookNotary.db_file = ':memory:'
123s
123s cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s started = Event()
123s def start_thread():
123s try:
123s bind_args = cls.get_bind_args()
123s app = cls.notebook = NotebookApp(
123s port_retries=0,
123s open_browser=False,
123s config_dir=cls.config_dir,
123s data_dir=cls.data_dir,
123s runtime_dir=cls.runtime_dir,
123s notebook_dir=cls.notebook_dir,
123s base_url=cls.url_prefix,
123s config=config,
123s allow_root=True,
123s token=cls.token,
123s **bind_args
123s )
123s if "asyncio" in sys.modules:
123s app._init_asyncio_patch()
123s import asyncio
123s
123s asyncio.set_event_loop(asyncio.new_event_loop())
123s # Patch the current loop in order to match production
123s # behavior
123s import nest_asyncio
123s
123s nest_asyncio.apply()
123s # don't register signal handler during tests
123s app.init_signal = lambda : None
123s # clear log handlers and propagate to root for nose to capture it
123s # needs to be redone after initialize, which reconfigures logging
123s app.log.propagate = True
123s app.log.handlers = []
123s app.initialize(argv=cls.get_argv())
123s app.log.propagate = True
123s app.log.handlers = []
123s loop = IOLoop.current()
123s loop.add_callback(started.set)
123s app.start()
123s finally:
123s # set the event, so failure to start doesn't cause a hang
123s started.set()
123s app.session_manager.close()
123s cls.notebook_thread = Thread(target=start_thread)
123s cls.notebook_thread.daemon = True
123s cls.notebook_thread.start()
123s started.wait()
123s > cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s cls.fetch_url(url)
123s except ModuleNotFoundError as error:
123s # Errors that should be immediately thrown back to caller
123s raise error
123s except Exception as e:
123s if not cls.notebook_thread.is_alive():
123s > raise RuntimeError("The notebook server failed to start") from e
123s E RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _______________ ERROR at setup of KernelAPITest.test_no_kernels ________________
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s > sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s address: tuple[str, int],
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s source_address: tuple[str, int] | None = None,
123s socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s """Connect to *address* and return the socket object.
123s
123s Convenience function. Connect to *address* (a 2-tuple ``(host,
123s port)``) and return the socket object. Passing the optional
123s *timeout* parameter will set the timeout on the socket instance
123s before attempting to connect. If no *timeout* is supplied, the
123s global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s is used. If *source_address* is set it must be a tuple of (host, port)
123s for the socket to bind as a source address before making the connection.
123s An host of '' or port 0 tells the OS to use the default.
123s """
123s
123s host, port = address
123s if host.startswith("["):
123s host = host.strip("[]")
123s err = None
123s
123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s # The original create_connection function always returns all records.
123s family = allowed_gai_family()
123s
123s try:
123s host.encode("idna")
123s except UnicodeError:
123s raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s af, socktype, proto, canonname, sa = res
123s sock = None
123s try:
123s sock = socket.socket(af, socktype, proto)
123s
123s # If provided, set socket level options before connecting.
123s _set_socket_options(sock, socket_options)
123s
123s if timeout is not _DEFAULT_TIMEOUT:
123s sock.settimeout(timeout)
123s if source_address:
123s sock.bind(source_address)
123s > sock.connect(sa)
123s E ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s def urlopen( # type: ignore[override]
123s self,
123s method: str,
123s url: str,
123s body: _TYPE_BODY | None = None,
123s headers: typing.Mapping[str, str] | None = None,
123s retries: Retry | bool | int | None = None,
123s redirect: bool = True,
123s assert_same_host: bool = True,
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s pool_timeout: int | None = None,
123s release_conn: bool | None = None,
123s chunked: bool = False,
123s body_pos: _TYPE_BODY_POSITION | None = None,
123s preload_content: bool = True,
123s decode_content: bool = True,
123s **response_kw: typing.Any,
123s ) -> BaseHTTPResponse:
123s """
123s Get a connection from the pool and perform an HTTP request. This is the
123s lowest level call for making a request, so you'll need to specify all
123s the raw details.
123s
123s .. note::
123s
123s More commonly, it's appropriate to use a convenience method
123s such as :meth:`request`.
123s
123s .. note::
123s
123s `release_conn` will only behave as expected if
123s `preload_content=False` because we want to make
123s `preload_content=False` the default behaviour someday soon without
123s breaking backwards compatibility.
123s
123s :param method:
123s HTTP request method (such as GET, POST, PUT, etc.)
123s
123s :param url:
123s The URL to perform the request on.
123s
123s :param body:
123s Data to send in the request body, either :class:`str`, :class:`bytes`,
123s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s :param headers:
123s Dictionary of custom headers to send, such as User-Agent,
123s If-None-Match, etc. If None, pool headers are used. If provided,
123s these headers completely replace any pool-specific headers.
123s
123s :param retries:
123s Configure the number of retries to allow before raising a
123s :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s Pass ``None`` to retry until you receive a response. Pass a
123s :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s over different types of retries.
123s Pass an integer number to retry connection errors that many times,
123s but no other types of errors. Pass zero to never retry.
123s
123s If ``False``, then retries are disabled and any exception is raised
123s immediately. Also, instead of raising a MaxRetryError on redirects,
123s the redirect response will be returned.
123s
123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s :param redirect:
123s If True, automatically handle redirects (status codes 301, 302,
123s 303, 307, 308). Each redirect counts as a retry. Disabling retries
123s will disable redirect, too.
123s
123s :param assert_same_host:
123s If ``True``, will make sure that the host of the pool requests is
123s consistent else will raise HostChangedError. When ``False``, you can
123s use the pool on an HTTP proxy and request foreign hosts.
123s
123s :param timeout:
123s If specified, overrides the default timeout for this one
123s request. It may be a float (in seconds) or an instance of
123s :class:`urllib3.util.Timeout`.
123s
123s :param pool_timeout:
123s If set and the pool is set to block=True, then this method will
123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s connection is available within the time period.
123s
123s :param bool preload_content:
123s If True, the response's body will be preloaded into memory.
123s
123s :param bool decode_content:
123s If True, will attempt to decode the body based on the
123s 'content-encoding' header.
123s
123s :param release_conn:
123s If False, then the urlopen call will not release the connection
123s back into the pool once a response is received (but will release if
123s you read the entire contents of the response such as when
123s `preload_content=True`). This is useful if you're not preloading
123s the response's content immediately. You will need to call
123s ``r.release_conn()`` on the response ``r`` to return the connection
123s back into the pool. If None, it takes the value of ``preload_content``
123s which defaults to ``True``.
123s
123s :param bool chunked:
123s If True, urllib3 will send the body using chunked transfer
123s encoding. Otherwise, urllib3 will send the body using the standard
123s content-length form. Defaults to False.
123s
123s :param int body_pos:
123s Position to seek to in file-like body in the event of a retry or
123s redirect. Typically this won't need to be set because urllib3 will
123s auto-populate the value when needed.
123s """
123s parsed_url = parse_url(url)
123s destination_scheme = parsed_url.scheme
123s
123s if headers is None:
123s headers = self.headers
123s
123s if not isinstance(retries, Retry):
123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s if release_conn is None:
123s release_conn = preload_content
123s
123s # Check host
123s if assert_same_host and not self.is_same_host(url):
123s raise HostChangedError(self, url, retries)
123s
123s # Ensure that the URL we're connecting to is properly encoded
123s if url.startswith("/"):
123s url = to_str(_encode_target(url))
123s else:
123s url = to_str(parsed_url.url)
123s
123s conn = None
123s
123s # Track whether `conn` needs to be released before
123s # returning/raising/recursing. Update this variable if necessary, and
123s # leave `release_conn` constant throughout the function. That way, if
123s # the function recurses, the original value of `release_conn` will be
123s # passed down into the recursive call, and its value will be respected.
123s #
123s # See issue #651 [1] for details.
123s #
123s # [1]
123s release_this_conn = release_conn
123s
123s http_tunnel_required = connection_requires_http_tunnel(
123s self.proxy, self.proxy_config, destination_scheme
123s )
123s
123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s # have to copy the headers dict so we can safely change it without those
123s # changes being reflected in anyone else's copy.
123s if not http_tunnel_required:
123s headers = headers.copy() # type: ignore[attr-defined]
123s headers.update(self.proxy_headers) # type: ignore[union-attr]
123s
123s # Must keep the exception bound to a separate variable or else Python 3
123s # complains about UnboundLocalError.
123s err = None
123s
123s # Keep track of whether we cleanly exited the except block. This
123s # ensures we do proper cleanup in finally.
123s clean_exit = False
123s
123s # Rewind body position, if needed. Record current position
123s # for future rewinds in the event of a redirect/retry.
123s body_pos = set_file_position(body, body_pos)
123s
123s try:
123s # Request a connection from the queue.
123s timeout_obj = self._get_timeout(timeout)
123s conn = self._get_conn(timeout=pool_timeout)
123s
123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s
123s # Is this a closed/new connection that requires CONNECT tunnelling?
123s if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s try:
123s self._prepare_proxy(conn)
123s except (BaseSSLError, OSError, SocketTimeout) as e:
123s self._raise_timeout(
123s err=e, url=self.proxy.url, timeout_value=conn.timeout
123s )
123s raise
123s
123s # If we're going to release the connection in ``finally:``, then
123s # the response doesn't need to know about the connection. Otherwise
123s # it will also try to release it and we'll have a double-release
123s # mess.
123s response_conn = conn if not release_conn else None
123s
123s # Make the request on the HTTPConnection object
123s > response = self._make_request(
123s conn,
123s method,
123s url,
123s timeout=timeout_obj,
123s body=body,
123s headers=headers,
123s chunked=chunked,
123s retries=retries,
123s response_conn=response_conn,
123s preload_content=preload_content,
123s decode_content=decode_content,
123s **response_kw,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ____________ ERROR at setup of AsyncKernelAPITest.test_connections _____________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s > resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s def increment(
123s self,
123s method: str | None = None,
123s url: str | None = None,
123s response: BaseHTTPResponse | None = None,
123s error: Exception | None = None,
123s _pool: ConnectionPool | None = None,
123s _stacktrace: TracebackType | None = None,
123s ) -> Retry:
123s """Return a new Retry object with incremented retry counters.
123s
123s :param response: A response object, or None, if the server did not
123s return a response.
123s :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s :param Exception error: An error encountered during the request, or
123s None if the response was received successfully.
123s
123s :return: A new ``Retry`` object.
123s """
123s if self.total is False and error:
123s # Disabled, indicate to re-raise the error.
123s raise reraise(type(error), error, _stacktrace)
123s
123s total = self.total
123s if total is not None:
123s total -= 1
123s
123s connect = self.connect
123s read = self.read
123s redirect = self.redirect
123s status_count = self.status
123s other = self.other
123s cause = "unknown"
123s status = None
123s redirect_location = None
123s
123s if error and self._is_connection_error(error):
123s # Connect retry?
123s if connect is False:
123s raise reraise(type(error), error, _stacktrace)
123s elif connect is not None:
123s connect -= 1
123s
123s elif error and self._is_read_error(error):
123s # Read retry?
123s if read is False or method is None or not self._is_method_retryable(method):
123s raise reraise(type(error), error, _stacktrace)
123s elif read is not None:
123s read -= 1
123s
123s elif error:
123s # Other retry?
123s if other is not None:
123s other -= 1
123s
123s elif response and response.get_redirect_location():
123s # Redirect retry?
123s if redirect is not None:
123s redirect -= 1
123s cause = "too many redirects"
123s response_redirect_location = response.get_redirect_location()
123s if response_redirect_location:
123s redirect_location = response_redirect_location
123s status = response.status
123s
123s else:
123s # Incrementing because of a server error like a 500 in
123s # status_forcelist and the given method is in the allowed_methods
123s cause = ResponseError.GENERIC_ERROR
123s if response and response.status:
123s if status_count is not None:
123s status_count -= 1
123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s status = response.status
123s
123s history = self.history + (
123s RequestHistory(method, url, error, status, redirect_location),
123s )
123s
123s new_retry = self.new(
123s total=total,
123s connect=connect,
123s read=read,
123s redirect=redirect,
123s status=status_count,
123s other=other,
123s history=history,
123s )
123s
123s if new_retry.is_exhausted():
123s reason = error or
ResponseError(cause)
123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s > cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s except (ProtocolError, OSError) as err:
123s raise ConnectionError(err, request=request)
123s
123s except MaxRetryError as e:
123s if isinstance(e.reason, ConnectTimeoutError):
123s # TODO: Remove this in 3.0.0: see #2811
123s if not isinstance(e.reason, NewConnectionError):
123s raise ConnectTimeout(e, request=request)
123s
123s if isinstance(e.reason, ResponseError):
123s raise RetryError(e, request=request)
123s
123s if isinstance(e.reason, _ProxyError):
123s raise ProxyError(e, request=request)
123s
123s if isinstance(e.reason, _SSLError):
123s # This branch is for urllib3 v1.22 and later.
123s raise SSLError(e, request=request)
123s
123s > raise ConnectionError(e, request=request)
123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s @classmethod
123s def setup_class(cls):
123s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required.
123s raise SkipTest("AsyncKernelAPITest tests skipped due to down-level jupyter_client!")
123s > super().setup_class()
123s
123s notebook/services/kernels/tests/test_kernels_api.py:206:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:198: in setup_class
123s cls.wait_until_alive()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s cls.fetch_url(url)
123s except ModuleNotFoundError as error:
123s # Errors that should be immediately thrown back to caller
123s raise error
123s except Exception as e:
123s if not cls.notebook_thread.is_alive():
123s > raise RuntimeError("The notebook server failed to start") from e
123s E RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ___________ ERROR at setup of AsyncKernelAPITest.test_default_kernel ___________
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s > sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s address: tuple[str, int],
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s source_address: tuple[str, int] | None = None,
123s socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s """Connect to *address* and return the socket object.
123s
123s Convenience function. Connect to *address* (a 2-tuple ``(host,
123s port)``) and return the socket object. Passing the optional
123s *timeout* parameter will set the timeout on the socket instance
123s before attempting to connect. If no *timeout* is supplied, the
123s global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s is used. If *source_address* is set it must be a tuple of (host, port)
123s for the socket to bind as a source address before making the connection.
123s An host of '' or port 0 tells the OS to use the default.
123s """
123s
123s host, port = address
123s if host.startswith("["):
123s host = host.strip("[]")
123s err = None
123s
123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s # The original create_connection function always returns all records.
123s family = allowed_gai_family()
123s
123s try:
123s host.encode("idna")
123s except UnicodeError:
123s raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s af, socktype, proto, canonname, sa = res
123s sock = None
123s try:
123s sock = socket.socket(af, socktype, proto)
123s
123s # If provided, set socket level options before connecting.
123s _set_socket_options(sock, socket_options)
123s
123s if timeout is not _DEFAULT_TIMEOUT:
123s sock.settimeout(timeout)
123s if source_address:
123s sock.bind(source_address)
123s > sock.connect(sa)
123s E ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s def urlopen( # type: ignore[override]
123s self,
123s method: str,
123s url: str,
123s body: _TYPE_BODY | None = None,
123s headers: typing.Mapping[str, str] | None = None,
123s retries: Retry | bool | int | None = None,
123s redirect: bool = True,
123s assert_same_host: bool =
True,
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s pool_timeout: int | None = None,
123s release_conn: bool | None = None,
123s chunked: bool = False,
123s body_pos: _TYPE_BODY_POSITION | None = None,
123s preload_content: bool = True,
123s decode_content: bool = True,
123s **response_kw: typing.Any,
123s ) -> BaseHTTPResponse:
123s """
123s Get a connection from the pool and perform an HTTP request. This is the
123s lowest level call for making a request, so you'll need to specify all
123s the raw details.
123s
123s .. note::
123s
123s More commonly, it's appropriate to use a convenience method
123s such as :meth:`request`.
123s
123s .. note::
123s
123s `release_conn` will only behave as expected if
123s `preload_content=False` because we want to make
123s `preload_content=False` the default behaviour someday soon without
123s breaking backwards compatibility.
123s
123s :param method:
123s HTTP request method (such as GET, POST, PUT, etc.)
123s
123s :param url:
123s The URL to perform the request on.
123s
123s :param body:
123s Data to send in the request body, either :class:`str`, :class:`bytes`,
123s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s :param headers:
123s Dictionary of custom headers to send, such as User-Agent,
123s If-None-Match, etc. If None, pool headers are used. If provided,
123s these headers completely replace any pool-specific headers.
123s
123s :param retries:
123s Configure the number of retries to allow before raising a
123s :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s Pass ``None`` to retry until you receive a response. Pass a
123s :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s over different types of retries.
123s Pass an integer number to retry connection errors that many times,
123s but no other types of errors. Pass zero to never retry.
123s
123s If ``False``, then retries are disabled and any exception is raised
123s immediately.
Also, instead of raising a MaxRetryError on redirects,
123s the redirect response will be returned.
123s
123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s :param redirect:
123s If True, automatically handle redirects (status codes 301, 302,
123s 303, 307, 308). Each redirect counts as a retry. Disabling retries
123s will disable redirect, too.
123s
123s :param assert_same_host:
123s If ``True``, will make sure that the host of the pool requests is
123s consistent else will raise HostChangedError. When ``False``, you can
123s use the pool on an HTTP proxy and request foreign hosts.
123s
123s :param timeout:
123s If specified, overrides the default timeout for this one
123s request. It may be a float (in seconds) or an instance of
123s :class:`urllib3.util.Timeout`.
123s
123s :param pool_timeout:
123s If set and the pool is set to block=True, then this method will
123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s connection is available within the time period.
123s
123s :param bool preload_content:
123s If True, the response's body will be preloaded into memory.
123s
123s :param bool decode_content:
123s If True, will attempt to decode the body based on the
123s 'content-encoding' header.
123s
123s :param release_conn:
123s If False, then the urlopen call will not release the connection
123s back into the pool once a response is received (but will release if
123s you read the entire contents of the response such as when
123s `preload_content=True`). This is useful if you're not preloading
123s the response's content immediately. You will need to call
123s ``r.release_conn()`` on the response ``r`` to return the connection
123s back into the pool. If None, it takes the value of ``preload_content``
123s which defaults to ``True``.
123s
123s :param bool chunked:
123s If True, urllib3 will send the body using chunked transfer
123s encoding.
Otherwise, urllib3 will send the body using the standard
123s content-length form. Defaults to False.
123s
123s :param int body_pos:
123s Position to seek to in file-like body in the event of a retry or
123s redirect. Typically this won't need to be set because urllib3 will
123s auto-populate the value when needed.
123s """
123s parsed_url = parse_url(url)
123s destination_scheme = parsed_url.scheme
123s
123s if headers is None:
123s headers = self.headers
123s
123s if not isinstance(retries, Retry):
123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s if release_conn is None:
123s release_conn = preload_content
123s
123s # Check host
123s if assert_same_host and not self.is_same_host(url):
123s raise HostChangedError(self, url, retries)
123s
123s # Ensure that the URL we're connecting to is properly encoded
123s if url.startswith("/"):
123s url = to_str(_encode_target(url))
123s else:
123s url = to_str(parsed_url.url)
123s
123s conn = None
123s
123s # Track whether `conn` needs to be released before
123s # returning/raising/recursing. Update this variable if necessary, and
123s # leave `release_conn` constant throughout the function. That way, if
123s # the function recurses, the original value of `release_conn` will be
123s # passed down into the recursive call, and its value will be respected.
123s #
123s # See issue #651 [1] for details.
123s #
123s # [1]
123s release_this_conn = release_conn
123s
123s http_tunnel_required = connection_requires_http_tunnel(
123s self.proxy, self.proxy_config, destination_scheme
123s )
123s
123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s # have to copy the headers dict so we can safely change it without those
123s # changes being reflected in anyone else's copy.
123s if not http_tunnel_required:
123s headers = headers.copy() # type: ignore[attr-defined]
123s headers.update(self.proxy_headers) # type: ignore[union-attr]
123s
123s # Must keep the exception bound to a separate variable or else Python 3
123s # complains about UnboundLocalError.
123s err = None
123s
123s # Keep track of whether we cleanly exited the except block. This
123s # ensures we do proper cleanup in finally.
123s clean_exit = False
123s
123s # Rewind body position, if needed. Record current position
123s # for future rewinds in the event of a redirect/retry.
123s body_pos = set_file_position(body, body_pos)
123s
123s try:
123s # Request a connection from the queue.
123s timeout_obj = self._get_timeout(timeout)
123s conn = self._get_conn(timeout=pool_timeout)
123s
123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s
123s # Is this a closed/new connection that requires CONNECT tunnelling?
123s if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s try:
123s self._prepare_proxy(conn)
123s except (BaseSSLError, OSError, SocketTimeout) as e:
123s self._raise_timeout(
123s err=e, url=self.proxy.url, timeout_value=conn.timeout
123s )
123s raise
123s
123s # If we're going to release the connection in ``finally:``, then
123s # the response doesn't need to know about the connection. Otherwise
123s # it will also try to release it and we'll have a double-release
123s # mess.
123s response_conn = conn if not release_conn else None
123s
123s # Make the request on the HTTPConnection object
123s > response = self._make_request(
123s conn,
123s method,
123s url,
123s timeout=timeout_obj,
123s body=body,
123s headers=headers,
123s chunked=chunked,
123s retries=retries,
123s response_conn=response_conn,
123s preload_content=preload_content,
123s decode_content=decode_content,
123s **response_kw,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s except socket.gaierror as e:
123s raise NameResolutionError(self.host, self, e) from e
123s except SocketTimeout as e:
123s raise ConnectTimeoutError(
123s self,
123s f"Connection to {self.host} timed out.
(connect timeout={self.timeout})",
123s ) from e
123s
123s except OSError as e:
123s > raise NewConnectionError(
123s self, f"Failed to establish a new connection: {e}"
123s ) from e
123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s > resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s def increment(
123s self,
123s method: str |
None = None,
123s url: str | None = None,
123s response: BaseHTTPResponse | None = None,
123s error: Exception | None = None,
123s _pool: ConnectionPool | None = None,
123s _stacktrace: TracebackType | None = None,
123s ) -> Retry:
123s """Return a new Retry object with incremented retry counters.
123s
123s :param response: A response object, or None, if the server did not
123s return a response.
123s :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s :param Exception error: An error encountered during the request, or
123s None if the response was received successfully.
123s
123s :return: A new ``Retry`` object.
123s """
123s if self.total is False and error:
123s # Disabled, indicate to re-raise the error.
123s raise reraise(type(error), error, _stacktrace)
123s
123s total = self.total
123s if total is not None:
123s total -= 1
123s
123s connect = self.connect
123s read = self.read
123s redirect = self.redirect
123s status_count = self.status
123s other = self.other
123s cause = "unknown"
123s status = None
123s redirect_location = None
123s
123s if error and self._is_connection_error(error):
123s # Connect retry?
123s if connect is False:
123s raise reraise(type(error), error, _stacktrace)
123s elif connect is not None:
123s connect -= 1
123s
123s elif error and self._is_read_error(error):
123s # Read retry?
123s if read is False or method is None or not self._is_method_retryable(method):
123s raise reraise(type(error), error, _stacktrace)
123s elif read is not None:
123s read -= 1
123s
123s elif error:
123s # Other retry?
123s if other is not None:
123s other -= 1
123s
123s elif response and response.get_redirect_location():
123s # Redirect retry?
123s if redirect is not None:
123s redirect -= 1
123s cause = "too many redirects"
123s response_redirect_location = response.get_redirect_location()
123s if response_redirect_location:
123s redirect_location = response_redirect_location
123s status = response.status
123s
123s else:
123s # Incrementing because of a server error like a 500 in
123s # status_forcelist and the given method is in the allowed_methods
123s cause = ResponseError.GENERIC_ERROR
123s if response and response.status:
123s if status_count is not None:
123s status_count -= 1
123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s status = response.status
123s
123s history = self.history + (
123s RequestHistory(method, url, error, status, redirect_location),
123s )
123s
123s new_retry = self.new(
123s total=total,
123s connect=connect,
123s read=read,
123s redirect=redirect,
123s status=status_count,
123s other=other,
123s history=history,
123s )
123s
123s if new_retry.is_exhausted():
123s reason = error or ResponseError(cause)
123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s > cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s return requests.get(url)
123s
/usr/lib/python3/dist-packages/requests/api.py:73: in get
123s return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 123s raise SkipTest("AsyncKernelAPITest tests skipped due to down-level jupyter_client!") 123s > super().setup_class() 123s 123s notebook/services/kernels/tests/test_kernels_api.py:206: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ___________ ERROR at setup of AsyncKernelAPITest.test_kernel_handler ___________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = 
True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. 
Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. 
Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 
123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 
123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | 
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 123s raise SkipTest("AsyncKernelAPITest tests skipped due to down-level jupyter_client!") 123s > super().setup_class() 123s 123s notebook/services/kernels/tests/test_kernels_api.py:206: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ________ ERROR at setup of AsyncKernelAPITest.test_main_kernel_handler _________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = 
True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. 
Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. 
Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 
123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 
123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | 
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 123s raise SkipTest("AsyncKernelAPITest tests skipped due to down-level jupyter_client!") 123s > super().setup_class() 123s 123s notebook/services/kernels/tests/test_kernels_api.py:206: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _____________ ERROR at setup of AsyncKernelAPITest.test_no_kernels _____________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = 
True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. 
Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. 
Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 
123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 
123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | 
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 123s raise SkipTest("AsyncKernelAPITest tests skipped due to down-level jupyter_client!") 123s > super().setup_class() 123s 123s notebook/services/kernels/tests/test_kernels_api.py:206: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ________________ ERROR at setup of KernelFilterTest.test_config ________________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = 
True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. 
Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. 
Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 
123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 
123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | 
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _______________ ERROR at setup of KernelCullingTest.test_culling _______________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s except (ProtocolError, OSError) as err:
123s raise ConnectionError(err, request=request)
123s
123s except MaxRetryError as e:
123s if isinstance(e.reason, ConnectTimeoutError):
123s # TODO: Remove this in 3.0.0: see #2811
123s if not isinstance(e.reason, NewConnectionError):
123s raise ConnectTimeout(e, request=request)
123s
123s if isinstance(e.reason, ResponseError):
123s raise RetryError(e, request=request)
123s
123s if isinstance(e.reason, _ProxyError):
123s raise ProxyError(e, request=request)
123s
123s if isinstance(e.reason, _SSLError):
123s # This branch is for urllib3 v1.22 and later.
123s raise SSLError(e, request=request)
123s
123s > raise ConnectionError(e, request=request)
123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls = 
123s
123s @classmethod
123s def setup_class(cls):
123s cls.tmp_dir = TemporaryDirectory()
123s def tmp(*parts):
123s path = os.path.join(cls.tmp_dir.name, *parts)
123s try:
123s os.makedirs(path)
123s except OSError as e:
123s if e.errno != errno.EEXIST:
123s raise
123s return path
123s
123s cls.home_dir = tmp('home')
123s data_dir = cls.data_dir = tmp('data')
123s config_dir = cls.config_dir = tmp('config')
123s runtime_dir = cls.runtime_dir = tmp('runtime')
123s cls.notebook_dir = tmp('notebooks')
123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s cls.env_patch.start()
123s # Patch systemwide & user-wide data & config directories, to isolate
123s # the tests from oddities of the local setup. But leave Python env
123s # locations alone, so data files for e.g. nbconvert are accessible.
123s # If this isolation isn't sufficient, you may need to run the tests in
123s # a virtualenv or conda env.
123s cls.path_patch = patch.multiple(
123s jupyter_core.paths,
123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s )
123s cls.path_patch.start()
123s
123s config = cls.config or Config()
123s config.NotebookNotary.db_file = ':memory:'
123s
123s cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s started = Event()
123s def start_thread():
123s try:
123s bind_args = cls.get_bind_args()
123s app = cls.notebook = NotebookApp(
123s port_retries=0,
123s open_browser=False,
123s config_dir=cls.config_dir,
123s data_dir=cls.data_dir,
123s runtime_dir=cls.runtime_dir,
123s notebook_dir=cls.notebook_dir,
123s base_url=cls.url_prefix,
123s config=config,
123s allow_root=True,
123s token=cls.token,
123s **bind_args
123s )
123s if "asyncio" in sys.modules:
123s app._init_asyncio_patch()
123s import asyncio
123s
123s asyncio.set_event_loop(asyncio.new_event_loop())
123s # Patch the current loop in order to match production
123s # behavior
123s import nest_asyncio
123s
123s nest_asyncio.apply()
123s # don't register signal handler during tests
123s app.init_signal = lambda : None
123s # clear log handlers and propagate to root for nose to capture it
123s # needs to be redone after initialize, which reconfigures logging
123s app.log.propagate = True
123s app.log.handlers = []
123s app.initialize(argv=cls.get_argv())
123s app.log.propagate = True
123s app.log.handlers = []
123s loop = IOLoop.current()
123s loop.add_callback(started.set)
123s app.start()
123s finally:
123s # set the event, so failure to start doesn't cause a hang
123s started.set()
123s app.session_manager.close()
123s cls.notebook_thread = Thread(target=start_thread)
123s cls.notebook_thread.daemon = True
123s cls.notebook_thread.start()
123s started.wait()
123s > cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls = 
123s
123s
@classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s cls.fetch_url(url)
123s except ModuleNotFoundError as error:
123s # Errors that should be immediately thrown back to caller
123s raise error
123s except Exception as e:
123s if not cls.notebook_thread.is_alive():
123s > raise RuntimeError("The notebook server failed to start") from e
123s E RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ___________ ERROR at setup of APITest.test_get_kernel_resource_file ____________
123s
123s self = 
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s > sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s address: tuple[str, int],
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s source_address: tuple[str, int] | None = None,
123s socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s """Connect to *address* and return the socket object.
123s
123s Convenience function. Connect to *address* (a 2-tuple ``(host,
123s port)``) and return the socket object.
Passing the optional
123s *timeout* parameter will set the timeout on the socket instance
123s before attempting to connect. If no *timeout* is supplied, the
123s global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s is used. If *source_address* is set it must be a tuple of (host, port)
123s for the socket to bind as a source address before making the connection.
123s An host of '' or port 0 tells the OS to use the default.
123s """
123s
123s host, port = address
123s if host.startswith("["):
123s host = host.strip("[]")
123s err = None
123s
123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s # The original create_connection function always returns all records.
123s family = allowed_gai_family()
123s
123s try:
123s host.encode("idna")
123s except UnicodeError:
123s raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s af, socktype, proto, canonname, sa = res
123s sock = None
123s try:
123s sock = socket.socket(af, socktype, proto)
123s
123s # If provided, set socket level options before connecting.
123s _set_socket_options(sock, socket_options)
123s
123s if timeout is not _DEFAULT_TIMEOUT:
123s sock.settimeout(timeout)
123s if source_address:
123s sock.bind(source_address)
123s > sock.connect(sa)
123s E ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s def urlopen( # type: ignore[override]
123s self,
123s method: str,
123s url: str,
123s body: _TYPE_BODY | None = None,
123s headers: typing.Mapping[str, str] | None = None,
123s retries: Retry | bool | int | None = None,
123s redirect: bool = True,
123s assert_same_host: bool = True,
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s pool_timeout: int | None = None,
123s release_conn: bool | None = None,
123s chunked: bool = False,
123s body_pos: _TYPE_BODY_POSITION | None = None,
123s preload_content: bool = True,
123s decode_content: bool = True,
123s **response_kw: typing.Any,
123s ) -> BaseHTTPResponse:
123s """
123s Get a connection from the pool and perform an HTTP request.
This is the
123s lowest level call for making a request, so you'll need to specify all
123s the raw details.
123s
123s .. note::
123s
123s More commonly, it's appropriate to use a convenience method
123s such as :meth:`request`.
123s
123s .. note::
123s
123s `release_conn` will only behave as expected if
123s `preload_content=False` because we want to make
123s `preload_content=False` the default behaviour someday soon without
123s breaking backwards compatibility.
123s
123s :param method:
123s HTTP request method (such as GET, POST, PUT, etc.)
123s
123s :param url:
123s The URL to perform the request on.
123s
123s :param body:
123s Data to send in the request body, either :class:`str`, :class:`bytes`,
123s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s :param headers:
123s Dictionary of custom headers to send, such as User-Agent,
123s If-None-Match, etc. If None, pool headers are used. If provided,
123s these headers completely replace any pool-specific headers.
123s
123s :param retries:
123s Configure the number of retries to allow before raising a
123s :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s Pass ``None`` to retry until you receive a response. Pass a
123s :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s over different types of retries.
123s Pass an integer number to retry connection errors that many times,
123s but no other types of errors. Pass zero to never retry.
123s
123s If ``False``, then retries are disabled and any exception is raised
123s immediately. Also, instead of raising a MaxRetryError on redirects,
123s the redirect response will be returned.
123s
123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s :param redirect:
123s If True, automatically handle redirects (status codes 301, 302,
123s 303, 307, 308). Each redirect counts as a retry. Disabling retries
123s will disable redirect, too.
123s
123s :param assert_same_host:
123s If ``True``, will make sure that the host of the pool requests is
123s consistent else will raise HostChangedError. When ``False``, you can
123s use the pool on an HTTP proxy and request foreign hosts.
123s
123s :param timeout:
123s If specified, overrides the default timeout for this one
123s request. It may be a float (in seconds) or an instance of
123s :class:`urllib3.util.Timeout`.
123s
123s :param pool_timeout:
123s If set and the pool is set to block=True, then this method will
123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s connection is available within the time period.
123s
123s :param bool preload_content:
123s If True, the response's body will be preloaded into memory.
123s
123s :param bool decode_content:
123s If True, will attempt to decode the body based on the
123s 'content-encoding' header.
123s
123s :param release_conn:
123s If False, then the urlopen call will not release the connection
123s back into the pool once a response is received (but will release if
123s you read the entire contents of the response such as when
123s `preload_content=True`). This is useful if you're not preloading
123s the response's content immediately. You will need to call
123s ``r.release_conn()`` on the response ``r`` to return the connection
123s back into the pool. If None, it takes the value of ``preload_content``
123s which defaults to ``True``.
123s
123s :param bool chunked:
123s If True, urllib3 will send the body using chunked transfer
123s encoding. Otherwise, urllib3 will send the body using the standard
123s content-length form. Defaults to False.
123s
123s :param int body_pos:
123s Position to seek to in file-like body in the event of a retry or
123s redirect. Typically this won't need to be set because urllib3 will
123s auto-populate the value when needed.
123s """
123s parsed_url = parse_url(url)
123s destination_scheme = parsed_url.scheme
123s
123s if headers is None:
123s headers = self.headers
123s
123s if not isinstance(retries, Retry):
123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s if release_conn is None:
123s release_conn = preload_content
123s
123s # Check host
123s if assert_same_host and not self.is_same_host(url):
123s raise HostChangedError(self, url, retries)
123s
123s # Ensure that the URL we're connecting to is properly encoded
123s if url.startswith("/"):
123s url = to_str(_encode_target(url))
123s else:
123s url = to_str(parsed_url.url)
123s
123s conn = None
123s
123s # Track whether `conn` needs to be released before
123s # returning/raising/recursing. Update this variable if necessary, and
123s # leave `release_conn` constant throughout the function. That way, if
123s # the function recurses, the original value of `release_conn` will be
123s # passed down into the recursive call, and its value will be respected.
123s #
123s # See issue #651 [1] for details.
123s #
123s # [1]
123s release_this_conn = release_conn
123s
123s http_tunnel_required = connection_requires_http_tunnel(
123s self.proxy, self.proxy_config, destination_scheme
123s )
123s
123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s # have to copy the headers dict so we can safely change it without those
123s # changes being reflected in anyone else's copy.
123s if not http_tunnel_required:
123s headers = headers.copy() # type: ignore[attr-defined]
123s headers.update(self.proxy_headers) # type: ignore[union-attr]
123s
123s # Must keep the exception bound to a separate variable or else Python 3
123s # complains about UnboundLocalError.
123s err = None
123s
123s # Keep track of whether we cleanly exited the except block. This
123s # ensures we do proper cleanup in finally.
123s clean_exit = False
123s
123s # Rewind body position, if needed.
Record current position
123s # for future rewinds in the event of a redirect/retry.
123s body_pos = set_file_position(body, body_pos)
123s
123s try:
123s # Request a connection from the queue.
123s timeout_obj = self._get_timeout(timeout)
123s conn = self._get_conn(timeout=pool_timeout)
123s
123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s
123s # Is this a closed/new connection that requires CONNECT tunnelling?
123s if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s try:
123s self._prepare_proxy(conn)
123s except (BaseSSLError, OSError, SocketTimeout) as e:
123s self._raise_timeout(
123s err=e, url=self.proxy.url, timeout_value=conn.timeout
123s )
123s raise
123s
123s # If we're going to release the connection in ``finally:``, then
123s # the response doesn't need to know about the connection. Otherwise
123s # it will also try to release it and we'll have a double-release
123s # mess.
123s response_conn = conn if not release_conn else None
123s
123s # Make the request on the HTTPConnection object
123s > response = self._make_request(
123s conn,
123s method,
123s url,
123s timeout=timeout_obj,
123s body=body,
123s headers=headers,
123s chunked=chunked,
123s retries=retries,
123s response_conn=response_conn,
123s preload_content=preload_content,
123s decode_content=decode_content,
123s **response_kw,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in
send
123s self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = 
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s except socket.gaierror as e:
123s raise NameResolutionError(self.host, self, e) from e
123s except SocketTimeout as e:
123s raise ConnectTimeoutError(
123s self,
123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s ) from e
123s
123s except OSError as e:
123s > raise NewConnectionError(
123s self, f"Failed to establish a new connection: {e}"
123s ) from e
123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s > resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s
123s def increment(
123s self,
123s method: str | None = None,
123s url: str | None = None,
123s response: BaseHTTPResponse | None = None,
123s error: Exception | None = None,
123s _pool: ConnectionPool | None = None,
123s _stacktrace: TracebackType | None = None,
123s ) -> Retry:
123s """Return a new Retry object with incremented retry counters.
123s
123s :param response: A response object, or None, if the server did not
123s return a response.
123s :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s :param Exception error: An error encountered during the request, or
123s None if the response was received successfully.
123s
123s :return: A new ``Retry`` object.
123s """
123s if self.total is False and error:
123s # Disabled, indicate to re-raise the error.
123s raise reraise(type(error), error, _stacktrace)
123s
123s total = self.total
123s if total is not None:
123s total -= 1
123s
123s connect = self.connect
123s read = self.read
123s redirect = self.redirect
123s status_count = self.status
123s other = self.other
123s cause = "unknown"
123s status = None
123s redirect_location = None
123s
123s if error and self._is_connection_error(error):
123s # Connect retry?
123s if connect is False:
123s raise reraise(type(error), error, _stacktrace)
123s elif connect is not None:
123s connect -= 1
123s
123s elif error and self._is_read_error(error):
123s # Read retry?
123s if read is False or method is None or not self._is_method_retryable(method):
123s raise reraise(type(error), error, _stacktrace)
123s elif read is not None:
123s read -= 1
123s
123s elif error:
123s # Other retry?
123s if other is not None:
123s other -= 1
123s
123s elif response and response.get_redirect_location():
123s # Redirect retry?
123s if redirect is not None:
123s redirect -= 1
123s cause = "too many redirects"
123s response_redirect_location = response.get_redirect_location()
123s if response_redirect_location:
123s redirect_location = response_redirect_location
123s status = response.status
123s
123s else:
123s # Incrementing because of a server error like a 500 in
123s # status_forcelist and the given method is in the allowed_methods
123s cause = ResponseError.GENERIC_ERROR
123s if response and response.status:
123s if status_count is not None:
123s status_count -= 1
123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s status = response.status
123s
123s history = self.history + (
123s RequestHistory(method, url, error, status, redirect_location),
123s )
123s
123s new_retry = self.new(
123s total=total,
123s connect=connect,
123s read=read,
123s redirect=redirect,
123s status=status_count,
123s other=other,
123s history=history,
123s )
123s
123s if new_retry.is_exhausted():
123s reason = error or
ResponseError(cause)
123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls = 
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s > cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s except (ProtocolError, OSError) as err:
123s raise ConnectionError(err, request=request)
123s
123s except MaxRetryError as e:
123s if isinstance(e.reason, ConnectTimeoutError):
123s # TODO: Remove this in 3.0.0: see #2811
123s if not isinstance(e.reason, NewConnectionError):
123s raise ConnectTimeout(e, request=request)
123s
123s if isinstance(e.reason, ResponseError):
123s raise RetryError(e, request=request)
123s
123s if isinstance(e.reason, _ProxyError):
123s raise ProxyError(e, request=request)
123s
123s if isinstance(e.reason, _SSLError):
123s # This branch is for urllib3 v1.22 and later.
123s raise SSLError(e, request=request)
123s
123s > raise ConnectionError(e, request=request)
123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls = 
123s
123s @classmethod
123s def setup_class(cls):
123s cls.tmp_dir = TemporaryDirectory()
123s def tmp(*parts):
123s path = os.path.join(cls.tmp_dir.name, *parts)
123s try:
123s os.makedirs(path)
123s except OSError as e:
123s if e.errno != errno.EEXIST:
123s raise
123s return path
123s
123s cls.home_dir = tmp('home')
123s data_dir = cls.data_dir = tmp('data')
123s config_dir = cls.config_dir = tmp('config')
123s runtime_dir = cls.runtime_dir = tmp('runtime')
123s cls.notebook_dir = tmp('notebooks')
123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s cls.env_patch.start()
123s # Patch systemwide & user-wide data & config directories, to isolate
123s # the tests from oddities of the local setup. But leave Python env
123s # locations alone, so data files for e.g. nbconvert are accessible.
123s # If this isolation isn't sufficient, you may need to run the tests in
123s # a virtualenv or conda env.
123s cls.path_patch = patch.multiple(
123s jupyter_core.paths,
123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s )
123s cls.path_patch.start()
123s
123s config = cls.config or Config()
123s config.NotebookNotary.db_file = ':memory:'
123s
123s cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s started = Event()
123s def start_thread():
123s try:
123s bind_args = cls.get_bind_args()
123s app = cls.notebook = NotebookApp(
123s port_retries=0,
123s open_browser=False,
123s config_dir=cls.config_dir,
123s data_dir=cls.data_dir,
123s runtime_dir=cls.runtime_dir,
123s notebook_dir=cls.notebook_dir,
123s base_url=cls.url_prefix,
123s config=config,
123s allow_root=True,
123s token=cls.token,
123s **bind_args
123s )
123s if "asyncio" in sys.modules:
123s app._init_asyncio_patch()
123s import asyncio
123s
123s asyncio.set_event_loop(asyncio.new_event_loop())
123s # Patch the current loop in order to match production
123s # behavior
123s import nest_asyncio
123s
123s nest_asyncio.apply()
123s # don't register signal handler during tests
123s app.init_signal = lambda : None
123s # clear log handlers and propagate to root for nose to capture it
123s # needs to be redone after initialize, which reconfigures logging
123s app.log.propagate = True
123s app.log.handlers = []
123s app.initialize(argv=cls.get_argv())
123s app.log.propagate = True
123s app.log.handlers = []
123s loop = IOLoop.current()
123s loop.add_callback(started.set)
123s app.start()
123s finally:
123s # set the event, so failure to start doesn't cause a hang
123s started.set()
123s app.session_manager.close()
123s cls.notebook_thread = Thread(target=start_thread)
123s cls.notebook_thread.daemon = True
123s cls.notebook_thread.start()
123s started.wait()
123s > cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls = 
123s
123s
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ________________ ERROR at setup of APITest.test_get_kernelspec _________________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _____________ ERROR at setup of APITest.test_get_kernelspec_spaces _____________
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s __________ ERROR at setup of APITest.test_get_nonexistant_kernelspec ___________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
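The `increment()` logic shown above explains the `MaxRetryError` in this traceback: with `Retry(total=0)`, the first connection failure decrements `total` to `-1`, `is_exhausted()` becomes true, and `MaxRetryError` is raised. A standalone sketch of that path (a generic `OSError` stands in for the log's `NewConnectionError`):

```python
from urllib3.exceptions import MaxRetryError
from urllib3.util.retry import Retry

# increment() returns a *new* Retry with counters decremented; it never
# mutates the original object. With total=0, one failure exhausts it.
retry = Retry(total=0)
exhausted = False
try:
    retry.increment(method="GET", url="/a%40b/api/contents",
                    error=OSError("connection refused"))
except MaxRetryError:
    exhausted = True

# The original Retry is unchanged; the decremented copy was the one
# that raised.
assert retry.total == 0
```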
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ___________ ERROR at setup of APITest.test_get_nonexistant_resource ____________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
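The `wait_until_alive` loop in that traceback is a plain poll-until-deadline pattern: probe the contents API, and if the probe fails while the server thread has already died, surface the startup failure immediately instead of waiting out the timeout. A self-contained sketch of the same pattern (the `probe` and `thread_alive` callables are hypothetical stand-ins for `cls.fetch_url` and `cls.notebook_thread.is_alive`):

```python
import time

MAX_WAITTIME = 30.0    # mirrors the constants used in launchnotebook.py
POLL_INTERVAL = 0.1

def wait_until_alive(probe, thread_alive):
    """Poll `probe` until it succeeds; fail fast if the server thread died."""
    last_error = None
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            probe()
            return
        except Exception as e:
            last_error = e
            # This is the branch taken in the log: the notebook thread
            # already exited, so retrying can never succeed.
            if not thread_alive():
                raise RuntimeError("The notebook server failed to start") from e
            time.sleep(POLL_INTERVAL)
    raise RuntimeError("The notebook server never responded") from last_error
```

In this log the server thread dies during `app.initialize()`, so the very first failed probe converts the `ConnectionError` into the `RuntimeError` that aborts every test's setup.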
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
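Every traceback in this log bottoms out in the same `[Errno 111] Connection refused` from `sock.connect()`: nothing is listening on `localhost:12341` because the notebook server thread never bound the port. The condition is easy to reproduce with the standard library alone (the port is found dynamically here rather than hard-coding the log's 12341, which may be in use elsewhere):

```python
import errno
import socket

def connect_errno(host: str, port: int) -> int:
    """Return 0 if a TCP connect succeeds, else the errno it failed with."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(2.0)
        return sock.connect_ex((host, port))

# Grab a port the OS considers free, then close it so nothing listens there.
probe = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
probe.bind(("127.0.0.1", 0))
closed_port = probe.getsockname()[1]
probe.close()

# Connecting now fails the same way the log does: ECONNREFUSED (111 on Linux),
# which urllib3 wraps as NewConnectionError and requests as ConnectionError.
result = connect_errno("127.0.0.1", closed_port)
```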
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _______________ ERROR at setup of APITest.test_list_kernelspecs ________________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _____________ ERROR at setup of APITest.test_list_kernelspecs_bad ______________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s 
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s             More commonly, it's appropriate to use a convenience method
123s             such as :meth:`request`.
123s 
123s         .. note::
123s 
123s             `release_conn` will only behave as expected if
123s             `preload_content=False` because we want to make
123s             `preload_content=False` the default behaviour someday soon without
123s             breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s                 raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _________________ ERROR at setup of APITest.test_list_formats __________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _________________ ERROR at setup of SessionAPITest.test_create _________________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _________ ERROR at setup of SessionAPITest.test_create_console_session _________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s
123s         if headers is None:
123s             headers = self.headers
123s
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s         if release_conn is None:
123s             release_conn = preload_content
123s
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s
123s         conn = None
123s
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1]
123s         release_this_conn = release_conn
123s
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ___________ ERROR at setup of SessionAPITest.test_create_deprecated ____________
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object. Passing the optional
123s     *timeout* parameter will set the timeout on the socket instance
123s     before attempting to connect. If no *timeout* is supplied, the
123s     global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s     is used. If *source_address* is set it must be a tuple of (host, port)
123s     for the socket to bind as a source address before making the connection.
123s     An host of '' or port 0 tells the OS to use the default.
123s     """
123s
123s     host, port = address
123s     if host.startswith("["):
123s         host = host.strip("[]")
123s     err = None
123s
123s     # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s     # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s     # The original create_connection function always returns all records.
123s     family = allowed_gai_family()
123s
123s     try:
123s         host.encode("idna")
123s     except UnicodeError:
123s         raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s         af, socktype, proto, canonname, sa = res
123s         sock = None
123s         try:
123s             sock = socket.socket(af, socktype, proto)
123s
123s             # If provided, set socket level options before connecting.
123s             _set_socket_options(sock, socket_options)
123s
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s
123s         .. note::
123s
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s
123s         if headers is None:
123s             headers = self.headers
123s
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s         if release_conn is None:
123s             release_conn = preload_content
123s
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s
123s         conn = None
123s
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1]
123s         release_this_conn = release_conn
123s
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s 
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s 
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s __________ ERROR at setup of SessionAPITest.test_create_file_session ___________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s 
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object. Passing the optional
123s     *timeout* parameter will set the timeout on the socket instance
123s     before attempting to connect. If no *timeout* is supplied, the
123s     global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s     is used. If *source_address* is set it must be a tuple of (host, port)
123s     for the socket to bind as a source address before making the connection.
123s     An host of '' or port 0 tells the OS to use the default.
123s     """
123s 
123s     host, port = address
123s     if host.startswith("["):
123s         host = host.strip("[]")
123s     err = None
123s 
123s     # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s     # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s     # The original create_connection function always returns all records.
123s     family = allowed_gai_family()
123s 
123s     try:
123s         host.encode("idna")
123s     except UnicodeError:
123s         raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s         af, socktype, proto, canonname, sa = res
123s         sock = None
123s         try:
123s             sock = socket.socket(af, socktype, proto)
123s 
123s             # If provided, set socket level options before connecting.
123s             _set_socket_options(sock, socket_options)
123s 
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _________ ERROR at setup of SessionAPITest.test_create_with_kernel_id __________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s 
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s 
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _________________ ERROR at setup of SessionAPITest.test_delete _________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s
123s :param assert_same_host:
123s If ``True``, will make sure that the host of the pool requests is
123s consistent else will raise HostChangedError. When ``False``, you can
123s use the pool on an HTTP proxy and request foreign hosts.
123s
123s :param timeout:
123s If specified, overrides the default timeout for this one
123s request. It may be a float (in seconds) or an instance of
123s :class:`urllib3.util.Timeout`.
123s
123s :param pool_timeout:
123s If set and the pool is set to block=True, then this method will
123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s connection is available within the time period.
123s
123s :param bool preload_content:
123s If True, the response's body will be preloaded into memory.
123s
123s :param bool decode_content:
123s If True, will attempt to decode the body based on the
123s 'content-encoding' header.
123s
123s :param release_conn:
123s If False, then the urlopen call will not release the connection
123s back into the pool once a response is received (but will release if
123s you read the entire contents of the response such as when
123s `preload_content=True`). This is useful if you're not preloading
123s the response's content immediately. You will need to call
123s ``r.release_conn()`` on the response ``r`` to return the connection
123s back into the pool. If None, it takes the value of ``preload_content``
123s which defaults to ``True``.
123s
123s :param bool chunked:
123s If True, urllib3 will send the body using chunked transfer
123s encoding. Otherwise, urllib3 will send the body using the standard
123s content-length form. Defaults to False.
123s
123s :param int body_pos:
123s Position to seek to in file-like body in the event of a retry or
123s redirect. Typically this won't need to be set because urllib3 will
123s auto-populate the value when needed.
123s """
123s parsed_url = parse_url(url)
123s destination_scheme = parsed_url.scheme
123s
123s if headers is None:
123s headers = self.headers
123s
123s if not isinstance(retries, Retry):
123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s if release_conn is None:
123s release_conn = preload_content
123s
123s # Check host
123s if assert_same_host and not self.is_same_host(url):
123s raise HostChangedError(self, url, retries)
123s
123s # Ensure that the URL we're connecting to is properly encoded
123s if url.startswith("/"):
123s url = to_str(_encode_target(url))
123s else:
123s url = to_str(parsed_url.url)
123s
123s conn = None
123s
123s # Track whether `conn` needs to be released before
123s # returning/raising/recursing. Update this variable if necessary, and
123s # leave `release_conn` constant throughout the function. That way, if
123s # the function recurses, the original value of `release_conn` will be
123s # passed down into the recursive call, and its value will be respected.
123s #
123s # See issue #651 [1] for details.
123s #
123s # [1]
123s release_this_conn = release_conn
123s
123s http_tunnel_required = connection_requires_http_tunnel(
123s self.proxy, self.proxy_config, destination_scheme
123s )
123s
123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s # have to copy the headers dict so we can safely change it without those
123s # changes being reflected in anyone else's copy.
123s if not http_tunnel_required:
123s headers = headers.copy() # type: ignore[attr-defined]
123s headers.update(self.proxy_headers) # type: ignore[union-attr]
123s
123s # Must keep the exception bound to a separate variable or else Python 3
123s # complains about UnboundLocalError.
123s err = None
123s
123s # Keep track of whether we cleanly exited the except block. This
123s # ensures we do proper cleanup in finally.
123s clean_exit = False
123s
123s # Rewind body position, if needed. Record current position
123s # for future rewinds in the event of a redirect/retry.
123s body_pos = set_file_position(body, body_pos)
123s
123s try:
123s # Request a connection from the queue.
123s timeout_obj = self._get_timeout(timeout)
123s conn = self._get_conn(timeout=pool_timeout)
123s
123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s
123s # Is this a closed/new connection that requires CONNECT tunnelling?
123s if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s try:
123s self._prepare_proxy(conn)
123s except (BaseSSLError, OSError, SocketTimeout) as e:
123s self._raise_timeout(
123s err=e, url=self.proxy.url, timeout_value=conn.timeout
123s )
123s raise
123s
123s # If we're going to release the connection in ``finally:``, then
123s # the response doesn't need to know about the connection. Otherwise
123s # it will also try to release it and we'll have a double-release
123s # mess.
123s response_conn = conn if not release_conn else None
123s
123s # Make the request on the HTTPConnection object
123s > response = self._make_request(
123s conn,
123s method,
123s url,
123s timeout=timeout_obj,
123s body=body,
123s headers=headers,
123s chunked=chunked,
123s retries=retries,
123s response_conn=response_conn,
123s preload_content=preload_content,
123s decode_content=decode_content,
123s **response_kw,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s except socket.gaierror as e:
123s raise NameResolutionError(self.host, self, e) from e
123s except SocketTimeout as e:
123s raise ConnectTimeoutError(
123s self,
123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s ) from e
123s
123s except OSError as e:
123s > raise NewConnectionError(
123s self, f"Failed to establish a new connection: {e}"
123s ) from e
123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s > resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s def increment(
123s self,
123s method: str | None = None,
123s url: str | None = None,
123s response: BaseHTTPResponse | None = None,
123s error: Exception | None = None,
123s _pool: ConnectionPool | None = None,
123s _stacktrace: TracebackType | None = None,
123s ) -> Retry:
123s """Return a new Retry object with incremented retry counters.
123s
123s :param response: A response object, or None, if the server did not
123s return a response.
123s :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s :param Exception error: An error encountered during the request, or
123s None if the response was received successfully.
123s
123s :return: A new ``Retry`` object.
123s """
123s if self.total is False and error:
123s # Disabled, indicate to re-raise the error.
123s raise reraise(type(error), error, _stacktrace)
123s
123s total = self.total
123s if total is not None:
123s total -= 1
123s
123s connect = self.connect
123s read = self.read
123s redirect = self.redirect
123s status_count = self.status
123s other = self.other
123s cause = "unknown"
123s status = None
123s redirect_location = None
123s
123s if error and self._is_connection_error(error):
123s # Connect retry?
123s if connect is False:
123s raise reraise(type(error), error, _stacktrace)
123s elif connect is not None:
123s connect -= 1
123s
123s elif error and self._is_read_error(error):
123s # Read retry?
123s if read is False or method is None or not self._is_method_retryable(method):
123s raise reraise(type(error), error, _stacktrace)
123s elif read is not None:
123s read -= 1
123s
123s elif error:
123s # Other retry?
123s if other is not None:
123s other -= 1
123s
123s elif response and response.get_redirect_location():
123s # Redirect retry?
123s if redirect is not None:
123s redirect -= 1
123s cause = "too many redirects"
123s response_redirect_location = response.get_redirect_location()
123s if response_redirect_location:
123s redirect_location = response_redirect_location
123s status = response.status
123s
123s else:
123s # Incrementing because of a server error like a 500 in
123s # status_forcelist and the given method is in the allowed_methods
123s cause = ResponseError.GENERIC_ERROR
123s if response and response.status:
123s if status_count is not None:
123s status_count -= 1
123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s status = response.status
123s
123s history = self.history + (
123s RequestHistory(method, url, error, status, redirect_location),
123s )
123s
123s new_retry = self.new(
123s total=total,
123s connect=connect,
123s read=read,
123s redirect=redirect,
123s status=status_count,
123s other=other,
123s history=history,
123s )
123s
123s if new_retry.is_exhausted():
123s reason = error or ResponseError(cause)
123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s > cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s except (ProtocolError, OSError) as err:
123s raise ConnectionError(err, request=request)
123s
123s except MaxRetryError as e:
123s if isinstance(e.reason, ConnectTimeoutError):
123s # TODO: Remove this in 3.0.0: see #2811
123s if not isinstance(e.reason, NewConnectionError):
123s raise ConnectTimeout(e, request=request)
123s
123s if isinstance(e.reason, ResponseError):
123s raise RetryError(e, request=request)
123s
123s if isinstance(e.reason, _ProxyError):
123s raise ProxyError(e, request=request)
123s
123s if isinstance(e.reason, _SSLError):
123s # This branch is for urllib3 v1.22 and later.
123s raise SSLError(e, request=request)
123s
123s > raise ConnectionError(e, request=request)
123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s @classmethod
123s def setup_class(cls):
123s cls.tmp_dir = TemporaryDirectory()
123s def tmp(*parts):
123s path = os.path.join(cls.tmp_dir.name, *parts)
123s try:
123s os.makedirs(path)
123s except OSError as e:
123s if e.errno != errno.EEXIST:
123s raise
123s return path
123s
123s cls.home_dir = tmp('home')
123s data_dir = cls.data_dir = tmp('data')
123s config_dir = cls.config_dir = tmp('config')
123s runtime_dir = cls.runtime_dir = tmp('runtime')
123s cls.notebook_dir = tmp('notebooks')
123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s cls.env_patch.start()
123s # Patch systemwide & user-wide data & config directories, to isolate
123s # the tests from oddities of the local setup. But leave Python env
123s # locations alone, so data files for e.g. nbconvert are accessible.
123s # If this isolation isn't sufficient, you may need to run the tests in
123s # a virtualenv or conda env.
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ____________ ERROR at setup of SessionAPITest.test_modify_kernel_id ____________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s             _set_socket_options(sock, socket_options)
123s 
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s def urlopen(  # type: ignore[override]
123s     self,
123s     method: str,
123s     url: str,
123s     body: _TYPE_BODY | None = None,
123s     headers: typing.Mapping[str, str] | None = None,
123s     retries: Retry | bool | int | None = None,
123s     redirect: bool = True,
123s     assert_same_host: bool = True,
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     pool_timeout: int | None = None,
123s     release_conn: bool | None = None,
123s     chunked: bool = False,
123s     body_pos: _TYPE_BODY_POSITION | None = None,
123s     preload_content: bool = True,
123s     decode_content: bool = True,
123s     **response_kw: typing.Any,
123s ) -> BaseHTTPResponse:
123s     """
123s     Get a connection from the pool and perform an HTTP request. This is the
123s     lowest level call for making a request, so you'll need to specify all
123s     the raw details.
123s 
123s     .. note::
123s 
123s        More commonly, it's appropriate to use a convenience method
123s        such as :meth:`request`.
123s 
123s     .. note::
123s 
123s        `release_conn` will only behave as expected if
123s        `preload_content=False` because we want to make
123s        `preload_content=False` the default behaviour someday soon without
123s        breaking backwards compatibility.
123s 
123s     :param method:
123s         HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s     :param url:
123s         The URL to perform the request on.
123s 
123s     :param body:
123s         Data to send in the request body, either :class:`str`, :class:`bytes`,
123s         an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s     :param headers:
123s         Dictionary of custom headers to send, such as User-Agent,
123s         If-None-Match, etc. If None, pool headers are used. If provided,
123s         these headers completely replace any pool-specific headers.
123s 
123s     :param retries:
123s         Configure the number of retries to allow before raising a
123s         :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s         Pass ``None`` to retry until you receive a response. Pass a
123s         :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s         over different types of retries.
123s         Pass an integer number to retry connection errors that many times,
123s         but no other types of errors. Pass zero to never retry.
123s 
123s         If ``False``, then retries are disabled and any exception is raised
123s         immediately. Also, instead of raising a MaxRetryError on redirects,
123s         the redirect response will be returned.
123s 
123s     :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s     :param redirect:
123s         If True, automatically handle redirects (status codes 301, 302,
123s         303, 307, 308). Each redirect counts as a retry. Disabling retries
123s         will disable redirect, too.
123s 
123s     :param assert_same_host:
123s         If ``True``, will make sure that the host of the pool requests is
123s         consistent else will raise HostChangedError. When ``False``, you can
123s         use the pool on an HTTP proxy and request foreign hosts.
123s 
123s     :param timeout:
123s         If specified, overrides the default timeout for this one
123s         request. It may be a float (in seconds) or an instance of
123s         :class:`urllib3.util.Timeout`.
123s 
123s     :param pool_timeout:
123s         If set and the pool is set to block=True, then this method will
123s         block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s         connection is available within the time period.
123s 
123s     :param bool preload_content:
123s         If True, the response's body will be preloaded into memory.
123s 
123s     :param bool decode_content:
123s         If True, will attempt to decode the body based on the
123s         'content-encoding' header.
123s 
123s     :param release_conn:
123s         If False, then the urlopen call will not release the connection
123s         back into the pool once a response is received (but will release if
123s         you read the entire contents of the response such as when
123s         `preload_content=True`). This is useful if you're not preloading
123s         the response's content immediately. You will need to call
123s         ``r.release_conn()`` on the response ``r`` to return the connection
123s         back into the pool. If None, it takes the value of ``preload_content``
123s         which defaults to ``True``.
123s 
123s     :param bool chunked:
123s         If True, urllib3 will send the body using chunked transfer
123s         encoding. Otherwise, urllib3 will send the body using the standard
123s         content-length form. Defaults to False.
123s 
123s     :param int body_pos:
123s         Position to seek to in file-like body in the event of a retry or
123s         redirect. Typically this won't need to be set because urllib3 will
123s         auto-populate the value when needed.
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s     :type timeout: float or tuple or urllib3 Timeout object
123s     :param verify: (optional) Either a boolean, in which case it controls whether
123s         we verify the server's TLS certificate, or a string, in which case it
123s         must be a path to a CA bundle to use
123s     :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s     :param proxies: (optional) The proxies dictionary to apply to the request.
123s     :rtype: requests.Response
123s     """
123s 
123s     try:
123s         conn = self.get_connection(request.url, proxies)
123s     except LocationValueError as e:
123s         raise InvalidURL(e, request=request)
123s 
123s     self.cert_verify(conn, request.url, verify, cert)
123s     url = self.request_url(request, proxies)
123s     self.add_headers(
123s         request,
123s         stream=stream,
123s         timeout=timeout,
123s         verify=verify,
123s         cert=cert,
123s         proxies=proxies,
123s     )
123s 
123s     chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s     if isinstance(timeout, tuple):
123s         try:
123s             connect, read = timeout
123s             timeout = TimeoutSauce(connect=connect, read=read)
123s         except ValueError:
123s             raise ValueError(
123s                 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                 f"or a single float to set both timeouts to the same value."
123s             )
123s     elif isinstance(timeout, TimeoutSauce):
123s         pass
123s     else:
123s         timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s     try:
123s >       resp = conn.urlopen(
123s             method=request.method,
123s             url=url,
123s             body=request.body,
123s             headers=request.headers,
123s             redirect=False,
123s             assert_same_host=False,
123s             preload_content=False,
123s             decode_content=False,
123s             retries=self.max_retries,
123s             timeout=timeout,
123s             chunked=chunked,
123s         )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s def increment(
123s     self,
123s     method: str | None = None,
123s     url: str | None = None,
123s     response: BaseHTTPResponse | None = None,
123s     error: Exception | None = None,
123s     _pool: ConnectionPool | None = None,
123s     _stacktrace: TracebackType | None = None,
123s ) -> Retry:
123s     """Return a new Retry object with incremented retry counters.
123s 
123s     :param response: A response object, or None, if the server did not
123s         return a response.
123s     :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s     :param Exception error: An error encountered during the request, or
123s         None if the response was received successfully.
123s 
123s     :return: A new ``Retry`` object.
123s     """
123s     if self.total is False and error:
123s         # Disabled, indicate to re-raise the error.
123s         raise reraise(type(error), error, _stacktrace)
123s 
123s     total = self.total
123s     if total is not None:
123s         total -= 1
123s 
123s     connect = self.connect
123s     read = self.read
123s     redirect = self.redirect
123s     status_count = self.status
123s     other = self.other
123s     cause = "unknown"
123s     status = None
123s     redirect_location = None
123s 
123s     if error and self._is_connection_error(error):
123s         # Connect retry?
123s         if connect is False:
123s             raise reraise(type(error), error, _stacktrace)
123s         elif connect is not None:
123s             connect -= 1
123s 
123s     elif error and self._is_read_error(error):
123s         # Read retry?
123s         if read is False or method is None or not self._is_method_retryable(method):
123s             raise reraise(type(error), error, _stacktrace)
123s         elif read is not None:
123s             read -= 1
123s 
123s     elif error:
123s         # Other retry?
123s         if other is not None:
123s             other -= 1
123s 
123s     elif response and response.get_redirect_location():
123s         # Redirect retry?
123s         if redirect is not None:
123s             redirect -= 1
123s         cause = "too many redirects"
123s         response_redirect_location = response.get_redirect_location()
123s         if response_redirect_location:
123s             redirect_location = response_redirect_location
123s         status = response.status
123s 
123s     else:
123s         # Incrementing because of a server error like a 500 in
123s         # status_forcelist and the given method is in the allowed_methods
123s         cause = ResponseError.GENERIC_ERROR
123s         if response and response.status:
123s             if status_count is not None:
123s                 status_count -= 1
123s             cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s             status = response.status
123s 
123s     history = self.history + (
123s         RequestHistory(method, url, error, status, redirect_location),
123s     )
123s 
123s     new_retry = self.new(
123s         total=total,
123s         connect=connect,
123s         read=read,
123s         redirect=redirect,
123s         status=status_count,
123s         other=other,
123s         history=history,
123s     )
123s 
123s     if new_retry.is_exhausted():
123s         reason = error or ResponseError(cause)
123s >       raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E       urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s @classmethod
123s def wait_until_alive(cls):
123s     """Wait for the server to be alive"""
123s     url = cls.base_url() + 'api/contents'
123s     for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s         try:
123s >           cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s def send(
123s     self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s     """Sends PreparedRequest object. Returns Response object.
123s 
123s     :param request: The :class:`PreparedRequest ` being sent.
123s     :param stream: (optional) Whether to stream the request content.
123s     :param timeout: (optional) How long to wait for the server to send
123s         data before giving up, as a float, or a :ref:`(connect timeout,
123s         read timeout) ` tuple.
123s     :type timeout: float or tuple or urllib3 Timeout object
123s     :param verify: (optional) Either a boolean, in which case it controls whether
123s         we verify the server's TLS certificate, or a string, in which case it
123s         must be a path to a CA bundle to use
123s     :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s     :param proxies: (optional) The proxies dictionary to apply to the request.
123s     :rtype: requests.Response
123s     """
123s 
123s     try:
123s         conn = self.get_connection(request.url, proxies)
123s     except LocationValueError as e:
123s         raise InvalidURL(e, request=request)
123s 
123s     self.cert_verify(conn, request.url, verify, cert)
123s     url = self.request_url(request, proxies)
123s     self.add_headers(
123s         request,
123s         stream=stream,
123s         timeout=timeout,
123s         verify=verify,
123s         cert=cert,
123s         proxies=proxies,
123s     )
123s 
123s     chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s     if isinstance(timeout, tuple):
123s         try:
123s             connect, read = timeout
123s             timeout = TimeoutSauce(connect=connect, read=read)
123s         except ValueError:
123s             raise ValueError(
123s                 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                 f"or a single float to set both timeouts to the same value."
123s             )
123s     elif isinstance(timeout, TimeoutSauce):
123s         pass
123s     else:
123s         timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s     try:
123s         resp = conn.urlopen(
123s             method=request.method,
123s             url=url,
123s             body=request.body,
123s             headers=request.headers,
123s             redirect=False,
123s             assert_same_host=False,
123s             preload_content=False,
123s             decode_content=False,
123s             retries=self.max_retries,
123s             timeout=timeout,
123s             chunked=chunked,
123s         )
123s 
123s     except (ProtocolError, OSError) as err:
123s         raise ConnectionError(err, request=request)
123s 
123s     except MaxRetryError as e:
123s         if isinstance(e.reason, ConnectTimeoutError):
123s             # TODO: Remove this in 3.0.0: see #2811
123s             if not isinstance(e.reason, NewConnectionError):
123s                 raise ConnectTimeout(e, request=request)
123s 
123s         if isinstance(e.reason, ResponseError):
123s             raise RetryError(e, request=request)
123s 
123s         if isinstance(e.reason, _ProxyError):
123s             raise ProxyError(e, request=request)
123s 
123s         if isinstance(e.reason, _SSLError):
123s             # This branch is for urllib3 v1.22 and later.
123s             raise SSLError(e, request=request)
123s 
123s >       raise ConnectionError(e, request=request)
123s E       requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s @classmethod
123s def setup_class(cls):
123s     cls.tmp_dir = TemporaryDirectory()
123s     def tmp(*parts):
123s         path = os.path.join(cls.tmp_dir.name, *parts)
123s         try:
123s             os.makedirs(path)
123s         except OSError as e:
123s             if e.errno != errno.EEXIST:
123s                 raise
123s         return path
123s 
123s     cls.home_dir = tmp('home')
123s     data_dir = cls.data_dir = tmp('data')
123s     config_dir = cls.config_dir = tmp('config')
123s     runtime_dir = cls.runtime_dir = tmp('runtime')
123s     cls.notebook_dir = tmp('notebooks')
123s     cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s     cls.env_patch.start()
123s     # Patch systemwide & user-wide data & config directories, to isolate
123s     # the tests from oddities of the local setup. But leave Python env
123s     # locations alone, so data files for e.g. nbconvert are accessible.
123s     # If this isolation isn't sufficient, you may need to run the tests in
123s     # a virtualenv or conda env.
123s     cls.path_patch = patch.multiple(
123s         jupyter_core.paths,
123s         SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s         SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s     )
123s     cls.path_patch.start()
123s 
123s     config = cls.config or Config()
123s     config.NotebookNotary.db_file = ':memory:'
123s 
123s     cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s     started = Event()
123s     def start_thread():
123s         try:
123s             bind_args = cls.get_bind_args()
123s             app = cls.notebook = NotebookApp(
123s                 port_retries=0,
123s                 open_browser=False,
123s                 config_dir=cls.config_dir,
123s                 data_dir=cls.data_dir,
123s                 runtime_dir=cls.runtime_dir,
123s                 notebook_dir=cls.notebook_dir,
123s                 base_url=cls.url_prefix,
123s                 config=config,
123s                 allow_root=True,
123s                 token=cls.token,
123s                 **bind_args
123s             )
123s             if "asyncio" in sys.modules:
123s                 app._init_asyncio_patch()
123s             import asyncio
123s 
123s             asyncio.set_event_loop(asyncio.new_event_loop())
123s             # Patch the current loop in order to match production
123s             # behavior
123s             import nest_asyncio
123s 
123s             nest_asyncio.apply()
123s             # don't register signal handler during tests
123s             app.init_signal = lambda : None
123s             # clear log handlers and propagate to root for nose to capture it
123s             # needs to be redone after initialize, which reconfigures logging
123s             app.log.propagate = True
123s             app.log.handlers = []
123s             app.initialize(argv=cls.get_argv())
123s             app.log.propagate = True
123s             app.log.handlers = []
123s             loop = IOLoop.current()
123s             loop.add_callback(started.set)
123s             app.start()
123s         finally:
123s             # set the event, so failure to start doesn't cause a hang
123s             started.set()
123s             app.session_manager.close()
123s     cls.notebook_thread = Thread(target=start_thread)
123s     cls.notebook_thread.daemon = True
123s     cls.notebook_thread.start()
123s     started.wait()
123s >   cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s @classmethod
123s def wait_until_alive(cls):
123s     """Wait for the server to be alive"""
123s     url = cls.base_url() + 'api/contents'
123s     for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s         try:
123s             cls.fetch_url(url)
123s         except ModuleNotFoundError as error:
123s             # Errors that should be immediately thrown back to caller
123s             raise error
123s         except Exception as e:
123s             if not cls.notebook_thread.is_alive():
123s >               raise RuntimeError("The notebook server failed to start") from e
123s E               RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ___________ ERROR at setup of SessionAPITest.test_modify_kernel_name ___________
123s 
123s self = 
123s 
123s def _new_conn(self) -> socket.socket:
123s     """Establish a socket connection and set nodelay settings on it.
123s 
123s     :return: New socket connection.
123s     """
123s     try:
123s >       sock = connection.create_connection(
123s             (self._dns_host, self.port),
123s             self.timeout,
123s             source_address=self.source_address,
123s             socket_options=self.socket_options,
123s         )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s 
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object. Passing the optional
123s     *timeout* parameter will set the timeout on the socket instance
123s     before attempting to connect. If no *timeout* is supplied, the
123s     global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s     is used. If *source_address* is set it must be a tuple of (host, port)
123s     for the socket to bind as a source address before making the connection.
123s     An host of '' or port 0 tells the OS to use the default.
123s     """
123s 
123s     host, port = address
123s     if host.startswith("["):
123s         host = host.strip("[]")
123s     err = None
123s 
123s     # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s     # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s     # The original create_connection function always returns all records.
123s     family = allowed_gai_family()
123s 
123s     try:
123s         host.encode("idna")
123s     except UnicodeError:
123s         raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s         af, socktype, proto, canonname, sa = res
123s         sock = None
123s         try:
123s             sock = socket.socket(af, socktype, proto)
123s 
123s             # If provided, set socket level options before connecting.
123s             _set_socket_options(sock, socket_options)
123s 
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s def urlopen(  # type: ignore[override]
123s     self,
123s     method: str,
123s     url: str,
123s     body: _TYPE_BODY | None = None,
123s     headers: typing.Mapping[str, str] | None = None,
123s     retries: Retry | bool | int | None = None,
123s     redirect: bool = True,
123s     assert_same_host: bool = True,
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     pool_timeout: int | None = None,
123s     release_conn: bool | None = None,
123s     chunked: bool = False,
123s     body_pos: _TYPE_BODY_POSITION | None = None,
123s     preload_content: bool = True,
123s     decode_content: bool = True,
123s     **response_kw: typing.Any,
123s ) -> BaseHTTPResponse:
123s     """
123s     Get a connection from the pool and perform an HTTP request. This is the
123s     lowest level call for making a request, so you'll need to specify all
123s     the raw details.
123s 
123s     .. note::
123s 
123s        More commonly, it's appropriate to use a convenience method
123s        such as :meth:`request`.
123s 
123s     .. note::
123s 
123s        `release_conn` will only behave as expected if
123s        `preload_content=False` because we want to make
123s        `preload_content=False` the default behaviour someday soon without
123s        breaking backwards compatibility.
123s 
123s     :param method:
123s         HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s     :param url:
123s         The URL to perform the request on.
123s 
123s     :param body:
123s         Data to send in the request body, either :class:`str`, :class:`bytes`,
123s         an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s     :param headers:
123s         Dictionary of custom headers to send, such as User-Agent,
123s         If-None-Match, etc. If None, pool headers are used. If provided,
123s         these headers completely replace any pool-specific headers.
123s 
123s     :param retries:
123s         Configure the number of retries to allow before raising a
123s         :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s         Pass ``None`` to retry until you receive a response. Pass a
123s         :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s         over different types of retries.
123s         Pass an integer number to retry connection errors that many times,
123s         but no other types of errors. Pass zero to never retry.
123s 
123s         If ``False``, then retries are disabled and any exception is raised
123s         immediately. Also, instead of raising a MaxRetryError on redirects,
123s         the redirect response will be returned.
123s 
123s     :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s     :param redirect:
123s         If True, automatically handle redirects (status codes 301, 302,
123s         303, 307, 308). Each redirect counts as a retry. Disabling retries
123s         will disable redirect, too.
123s 
123s     :param assert_same_host:
123s         If ``True``, will make sure that the host of the pool requests is
123s         consistent else will raise HostChangedError. When ``False``, you can
123s         use the pool on an HTTP proxy and request foreign hosts.
123s 
123s     :param timeout:
123s         If specified, overrides the default timeout for this one
123s         request. It may be a float (in seconds) or an instance of
123s         :class:`urllib3.util.Timeout`.
123s 
123s     :param pool_timeout:
123s         If set and the pool is set to block=True, then this method will
123s         block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s         connection is available within the time period.
123s 
123s     :param bool preload_content:
123s         If True, the response's body will be preloaded into memory.
123s 
123s     :param bool decode_content:
123s         If True, will attempt to decode the body based on the
123s         'content-encoding' header.
123s 
123s     :param release_conn:
123s         If False, then the urlopen call will not release the connection
123s         back into the pool once a response is received (but will release if
123s         you read the entire contents of the response such as when
123s         `preload_content=True`). This is useful if you're not preloading
123s         the response's content immediately. You will need to call
123s         ``r.release_conn()`` on the response ``r`` to return the connection
123s         back into the pool. If None, it takes the value of ``preload_content``
123s         which defaults to ``True``.
123s 
123s     :param bool chunked:
123s         If True, urllib3 will send the body using chunked transfer
123s         encoding. Otherwise, urllib3 will send the body using the standard
123s         content-length form. Defaults to False.
123s 
123s     :param int body_pos:
123s         Position to seek to in file-like body in the event of a retry or
123s         redirect. Typically this won't need to be set because urllib3 will
123s         auto-populate the value when needed.
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ______________ ERROR at setup of SessionAPITest.test_modify_path _______________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
    @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _________ ERROR at setup of SessionAPITest.test_modify_path_deprecated _________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s 
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object.
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ______________ ERROR at setup of SessionAPITest.test_modify_type _______________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
123s @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ______________ ERROR at setup of AsyncSessionAPITest.test_create _______________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object.
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
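The `retries` parameter documented above is what produces the `Retry(total=0, connect=None, read=False, ...)` objects seen throughout this log: with a budget of zero, the very first connection error exhausts the retry object and `MaxRetryError` is raised. A toy sketch of that decrement-and-raise pattern (a deliberately simplified stand-in, not urllib3's actual `Retry` class, which also tracks per-category counters and history):

```python
class SimpleRetry:
    """Toy analogue of urllib3's Retry.increment(): each error produces a
    new object with `total` decremented; an exhausted budget raises, with
    the triggering error chained as the cause."""

    def __init__(self, total):
        self.total = total

    def increment(self, error):
        nxt = SimpleRetry(self.total - 1)
        if nxt.total < 0:  # mirrors Retry.is_exhausted()
            raise RuntimeError("Max retries exceeded") from error
        return nxt
```

With `SimpleRetry(0)`, the first `increment()` raises immediately, which is exactly why each failed `GET /a%40b/api/contents` in this log surfaces as `Max retries exceeded` after a single attempt.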
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
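`HTTPAdapter.send()` in the frames that follow normalizes its `timeout` argument before calling `urlopen`: a `(connect, read)` tuple is split, a single number is used for both, and a malformed tuple raises `ValueError`. A hypothetical stand-alone version of just that branch (returning a plain tuple instead of requests' internal `TimeoutSauce`):

```python
def normalize_timeout(timeout):
    """Sketch of the timeout-normalising branch of requests' adapter:
    accept a (connect, read) tuple, a single number for both, or None."""
    if isinstance(timeout, tuple):
        try:
            connect, read = timeout
        except ValueError:
            raise ValueError(
                f"Invalid timeout {timeout}. Pass a (connect, read) "
                f"timeout tuple, or a single float to set both timeouts "
                f"to the same value."
            )
        return connect, read
    return timeout, timeout
```

Note that `None` (the value in these tracebacks: `Timeout(connect=None, read=None, total=None)`) simply means "no timeout" for both phases, which is why the failing connects error out on refusal rather than timing out.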
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
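The `wait_until_alive()` frames above show the harness's polling pattern: retry `fetch_url` until it succeeds, but bail out immediately with a chained `RuntimeError` once the server thread has died. A condensed, hypothetical re-implementation (`fetch` and `thread_alive` are injected callables so the sketch needs no real server; the real method polls `cls.fetch_url` and `cls.notebook_thread.is_alive`):

```python
import time

MAX_WAITTIME = 30     # seconds; values assumed, mirroring the harness
POLL_INTERVAL = 0.1

def wait_until_alive(fetch, thread_alive):
    """Poll `fetch` until it succeeds, the worker dies, or time runs out."""
    last_error = None
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            return fetch()
        except Exception as e:
            last_error = e
            if not thread_alive():
                # `raise ... from e` sets __cause__, producing the
                # "direct cause of the following exception" lines in the log
                raise RuntimeError("The notebook server failed to start") from e
            time.sleep(POLL_INTERVAL)
    raise TimeoutError("server never became responsive") from last_error
```

The `from e` chaining is what lets the log show the underlying `ConnectionError` above each final `RuntimeError: The notebook server failed to start`.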
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 
123s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 123s > super().setup_class() 123s 123s notebook/services/sessions/tests/test_sessions_api.py:274: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ______ ERROR at setup of AsyncSessionAPITest.test_create_console_session _______ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = 
True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. 
Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. 
Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 
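The `retries` parameter semantics described in the docstring above (`None` = retry until a response, an int = that many connection retries, `False` = disabled) can be sketched in isolation. This is a hypothetical stand-in for urllib3's `Retry.from_int`, not the real class:

```python
# Sketch of how urlopen-style retry arguments can be normalized, mirroring
# the semantics the docstring above describes. RetrySketch is a hypothetical
# stand-in, not urllib3's real Retry.
from dataclasses import dataclass

@dataclass
class RetrySketch:
    total: object           # int, None (unlimited), or False (disabled)
    redirect: object = None

    @classmethod
    def from_int(cls, retries, redirect=True, default=None):
        # None with no default means "retry until a response arrives".
        if retries is None:
            return default if default is not None else cls(total=None)
        if isinstance(retries, cls):
            return retries
        # False disables retries; redirects then raise immediately too.
        redirect = bool(redirect) and retries is not False
        return cls(total=retries, redirect=redirect)

r = RetrySketch.from_int(3)
print(r.total)  # 3
```

Note how `Retry(total=0, ...)` in the traceback above matches the "pass zero to never retry" case: the very first connection error exhausts the budget.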
123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 
123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
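The "The above exception was the direct cause of the following exception" banners in this log come from the `raise NewConnectionError(...) from e` pattern shown above: `raise X from e` stores the original error on `X.__cause__`. A minimal stdlib-only illustration (the exception class here is a hypothetical stand-in, not urllib3's):

```python
# `raise X from e` sets X.__cause__, which produces the "direct cause"
# banner seen throughout this traceback. NewConnectionSketchError is a
# hypothetical stand-in for urllib3's NewConnectionError.
class NewConnectionSketchError(OSError):
    pass

def connect_sketch():
    try:
        # Mimic the low-level failure: errno 111 is ECONNREFUSED on Linux.
        raise ConnectionRefusedError(111, "Connection refused")
    except OSError as e:
        raise NewConnectionSketchError(
            f"Failed to establish a new connection: {e}"
        ) from e

cause = None
try:
    connect_sketch()
except NewConnectionSketchError as err:
    cause = err.__cause__

print(type(cause).__name__, cause.errno)  # ConnectionRefusedError 111
```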
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | 
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
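The `Retry.increment` logic shown above decrements the relevant budget on each failure and raises `MaxRetryError` once `is_exhausted()` is true. A hedged, self-contained sketch of that exhaustion flow (both classes here are hypothetical stand-ins, not urllib3's):

```python
# Sketch of the retry-exhaustion flow in Retry.increment above: each failure
# returns a decremented copy; an exhausted budget raises a MaxRetryError-style
# exception wrapping the last cause. Both classes are hypothetical stand-ins.
class MaxRetrySketchError(Exception):
    def __init__(self, url, reason):
        self.url, self.reason = url, reason
        super().__init__(f"Max retries exceeded with url: {url} (Caused by {reason!r})")

class RetryBudgetSketch:
    def __init__(self, total):
        self.total = total
    def is_exhausted(self):
        return self.total is not None and self.total < 0
    def increment(self, url, error):
        new = RetryBudgetSketch(None if self.total is None else self.total - 1)
        if new.is_exhausted():
            raise MaxRetrySketchError(url, error) from error
        return new

retries = RetryBudgetSketch(total=0)  # matches Retry(total=0) in the log above
reason_name = None
try:
    retries.increment("/a%40b/api/contents", ConnectionRefusedError(111, "refused"))
except MaxRetrySketchError as e:
    reason_name = type(e.reason).__name__

print(reason_name)  # ConnectionRefusedError
```

This is why `total=0` in the log fails on the very first refused connection: the single decrement takes the budget below zero.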
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 123s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 123s > super().setup_class() 123s 123s notebook/services/sessions/tests/test_sessions_api.py:274: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _________ ERROR at setup of AsyncSessionAPITest.test_create_deprecated _________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
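The `wait_until_alive` helper in `launchnotebook.py` above polls the server until a request succeeds, re-raising with a `RuntimeError` if the server thread has died. A simplified sketch of that polling pattern, assuming a generic `probe` callable in place of `cls.fetch_url` and made-up default timings:

```python
# Sketch of the wait_until_alive polling pattern from launchnotebook.py above.
# `probe` stands in for cls.fetch_url; max_waittime/poll_interval mirror the
# MAX_WAITTIME/POLL_INTERVAL constants referenced in the log (values assumed).
import time

def wait_until_alive(probe, server_alive, max_waittime=1.0, poll_interval=0.1):
    last_error = None
    for _ in range(int(max_waittime / poll_interval)):
        try:
            return probe()
        except Exception as e:
            last_error = e
            # If the server thread is gone, there is no point polling further.
            if not server_alive():
                raise RuntimeError("The notebook server failed to start") from e
            time.sleep(poll_interval)
    raise RuntimeError("Server never came up in time") from last_error

# A probe that succeeds on the third attempt:
attempts = {"n": 0}
def probe():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("refused")
    return "ok"

result = wait_until_alive(probe, server_alive=lambda: True)
print(result)  # ok
```

In the failing runs above, the server thread was dead, so the loop took the `RuntimeError("The notebook server failed to start")` branch on the first refused connection.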
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = 
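The `create_connection` loop above tries each address returned by `getaddrinfo` in turn and re-raises the last `OSError` if every attempt fails. The `[Errno 111] Connection refused` in this log can be reproduced with the stdlib alone by connecting to a loopback port nothing is listening on:

```python
# Reproducing the ECONNREFUSED (errno 111) failure mode from the log:
# connect to a loopback port with no listener behind it.
import socket

# Reserve a free port by binding to port 0 and immediately closing,
# so the subsequent connect is very likely to find nothing listening.
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind(("127.0.0.1", 0))
port = s.getsockname()[1]
s.close()

err_name = None
try:
    socket.create_connection(("127.0.0.1", port), timeout=1.0)
except OSError as e:
    err_name = type(e).__name__

print(err_name)  # ConnectionRefusedError
```

This is exactly the situation in the test run: the notebook server never bound `localhost:12341`, so every `sock.connect(sa)` attempt was refused by the kernel.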
True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. 
Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. 
Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s     @classmethod
123s     def setup_class(cls):
123s         if not async_testing_enabled:  # Can be removed once jupyter_client >= 6.1 is required.
123s             raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!")
123s >       super().setup_class()
123s
123s notebook/services/sessions/tests/test_sessions_api.py:274:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:198: in setup_class
123s     cls.wait_until_alive()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ________ ERROR at setup of AsyncSessionAPITest.test_create_file_session ________
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s
123s         .. note::
123s
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s
123s         if headers is None:
123s             headers = self.headers
123s
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s         if release_conn is None:
123s             release_conn = preload_content
123s
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s
123s         conn = None
123s
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1]
123s         release_this_conn = release_conn
123s
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 123s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 123s > super().setup_class() 123s 123s notebook/services/sessions/tests/test_sessions_api.py:274: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _______ ERROR at setup of AsyncSessionAPITest.test_create_with_kernel_id _______ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = 
True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. 
Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. 
Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 
123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 
123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | 
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 123s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 123s > super().setup_class() 123s 123s notebook/services/sessions/tests/test_sessions_api.py:274: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ______________ ERROR at setup of AsyncSessionAPITest.test_delete _______________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s 
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen( # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy() # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers) # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s 
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required.
123s             raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!")
123s >       super().setup_class()
123s 
123s notebook/services/sessions/tests/test_sessions_api.py:274: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:198: in setup_class
123s     cls.wait_until_alive()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _________ ERROR at setup of AsyncSessionAPITest.test_modify_kernel_id __________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s 
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen( # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s if not http_tunnel_required:
123s headers = headers.copy() # type: ignore[attr-defined]
123s headers.update(self.proxy_headers) # type: ignore[union-attr]
123s
123s # Must keep the exception bound to a separate variable or else Python 3
123s # complains about UnboundLocalError.
123s err = None
123s
123s # Keep track of whether we cleanly exited the except block. This
123s # ensures we do proper cleanup in finally.
123s clean_exit = False
123s
123s # Rewind body position, if needed. Record current position
123s # for future rewinds in the event of a redirect/retry.
123s body_pos = set_file_position(body, body_pos)
123s
123s try:
123s # Request a connection from the queue.
123s timeout_obj = self._get_timeout(timeout)
123s conn = self._get_conn(timeout=pool_timeout)
123s
123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s
123s # Is this a closed/new connection that requires CONNECT tunnelling?
123s if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s try:
123s self._prepare_proxy(conn)
123s except (BaseSSLError, OSError, SocketTimeout) as e:
123s self._raise_timeout(
123s err=e, url=self.proxy.url, timeout_value=conn.timeout
123s )
123s raise
123s
123s # If we're going to release the connection in ``finally:``, then
123s # the response doesn't need to know about the connection. Otherwise
123s # it will also try to release it and we'll have a double-release
123s # mess.
123s response_conn = conn if not release_conn else None
123s
123s # Make the request on the HTTPConnection object
123s > response = self._make_request(
123s conn,
123s method,
123s url,
123s timeout=timeout_obj,
123s body=body,
123s headers=headers,
123s chunked=chunked,
123s retries=retries,
123s response_conn=response_conn,
123s preload_content=preload_content,
123s decode_content=decode_content,
123s **response_kw,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s except socket.gaierror as e:
123s raise NameResolutionError(self.host, self, e) from e
123s except SocketTimeout as e:
123s raise ConnectTimeoutError(
123s self,
123s f"Connection to {self.host} timed out.
(connect timeout={self.timeout})",
123s ) from e
123s
123s except OSError as e:
123s > raise NewConnectionError(
123s self, f"Failed to establish a new connection: {e}"
123s ) from e
123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s > resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s def increment(
123s self,
123s method: str |
None = None,
123s url: str | None = None,
123s response: BaseHTTPResponse | None = None,
123s error: Exception | None = None,
123s _pool: ConnectionPool | None = None,
123s _stacktrace: TracebackType | None = None,
123s ) -> Retry:
123s """Return a new Retry object with incremented retry counters.
123s
123s :param response: A response object, or None, if the server did not
123s return a response.
123s :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s :param Exception error: An error encountered during the request, or
123s None if the response was received successfully.
123s
123s :return: A new ``Retry`` object.
123s """
123s if self.total is False and error:
123s # Disabled, indicate to re-raise the error.
123s raise reraise(type(error), error, _stacktrace)
123s
123s total = self.total
123s if total is not None:
123s total -= 1
123s
123s connect = self.connect
123s read = self.read
123s redirect = self.redirect
123s status_count = self.status
123s other = self.other
123s cause = "unknown"
123s status = None
123s redirect_location = None
123s
123s if error and self._is_connection_error(error):
123s # Connect retry?
123s if connect is False:
123s raise reraise(type(error), error, _stacktrace)
123s elif connect is not None:
123s connect -= 1
123s
123s elif error and self._is_read_error(error):
123s # Read retry?
123s if read is False or method is None or not self._is_method_retryable(method):
123s raise reraise(type(error), error, _stacktrace)
123s elif read is not None:
123s read -= 1
123s
123s elif error:
123s # Other retry?
123s if other is not None:
123s other -= 1
123s
123s elif response and response.get_redirect_location():
123s # Redirect retry?
123s if redirect is not None:
123s redirect -= 1
123s cause = "too many redirects"
123s response_redirect_location = response.get_redirect_location()
123s if response_redirect_location:
123s redirect_location = response_redirect_location
123s status = response.status
123s
123s else:
123s # Incrementing because of a server error like a 500 in
123s # status_forcelist and the given method is in the allowed_methods
123s cause = ResponseError.GENERIC_ERROR
123s if response and response.status:
123s if status_count is not None:
123s status_count -= 1
123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s status = response.status
123s
123s history = self.history + (
123s RequestHistory(method, url, error, status, redirect_location),
123s )
123s
123s new_retry = self.new(
123s total=total,
123s connect=connect,
123s read=read,
123s redirect=redirect,
123s status=status_count,
123s other=other,
123s history=history,
123s )
123s
123s if new_retry.is_exhausted():
123s reason = error or ResponseError(cause)
123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s > cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s return requests.get(url)
123s
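[editor's note] The `Retry.increment` frame above is the pivot of this failure: `Retry(total=0)` means the first `NewConnectionError` already exhausts the counters, which is what converts the refused connection into a `MaxRetryError`. A minimal stdlib-only sketch of that counter bookkeeping (an illustration, not urllib3's actual class; `SketchRetry` is a made-up name):

```python
# Sketch of the decrement-and-check logic seen in Retry.increment above:
# each error decrements `total` (and the matching per-category counter),
# and a retry object whose counters have gone negative is "exhausted".
from dataclasses import dataclass, replace


class MaxRetryError(Exception):
    """Stand-in for urllib3.exceptions.MaxRetryError."""


@dataclass(frozen=True)
class SketchRetry:
    total: "int | None" = 0
    connect: "int | None" = None

    def is_exhausted(self) -> bool:
        counts = [c for c in (self.total, self.connect) if c is not None]
        return min(counts) < 0 if counts else False

    def increment(self, error: Exception) -> "SketchRetry":
        total = self.total - 1 if self.total is not None else None
        connect = self.connect - 1 if self.connect is not None else None
        new = replace(self, total=total, connect=connect)
        if new.is_exhausted():
            # Mirrors `raise MaxRetryError(_pool, url, reason) from reason`
            raise MaxRetryError(error) from error
        return new


# Retry(total=0), as in the log, is exhausted after a single failure:
try:
    SketchRetry(total=0).increment(ConnectionRefusedError(111, "Connection refused"))
except MaxRetryError:
    print("max retries exceeded")
```

With `total=2` the same call simply returns a new object with `total=1`; only the transition below zero raises.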
/usr/lib/python3/dist-packages/requests/api.py:73: in get
123s return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s except (ProtocolError, OSError) as err:
123s raise ConnectionError(err, request=request)
123s
123s except MaxRetryError as e:
123s if isinstance(e.reason, ConnectTimeoutError):
123s # TODO: Remove this in 3.0.0: see #2811
123s if not isinstance(e.reason, NewConnectionError):
123s raise ConnectTimeout(e, request=request)
123s
123s if isinstance(e.reason, ResponseError):
123s raise RetryError(e, request=request)
123s
123s if isinstance(e.reason, _ProxyError):
123s raise ProxyError(e, request=request)
123s
123s if isinstance(e.reason, _SSLError):
123s # This branch is for urllib3 v1.22 and later.
123s raise SSLError(e, request=request)
123s
123s > raise ConnectionError(e, request=request)
123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s @classmethod
123s def setup_class(cls):
123s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required.
123s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!")
123s > super().setup_class()
123s
123s notebook/services/sessions/tests/test_sessions_api.py:274:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:198: in setup_class
123s cls.wait_until_alive()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s cls.fetch_url(url)
123s except ModuleNotFoundError as error:
123s # Errors that should be immediately thrown back to caller
123s raise error
123s except Exception as e:
123s if not cls.notebook_thread.is_alive():
123s > raise RuntimeError("The notebook server failed to start") from e
123s E RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ________ ERROR at setup of AsyncSessionAPITest.test_modify_kernel_name _________
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family()
123s
123s try:
123s host.encode("idna")
123s except UnicodeError:
123s raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s af, socktype, proto, canonname, sa = res
123s sock = None
123s try:
123s sock = socket.socket(af, socktype, proto)
123s
123s # If provided, set socket level options before connecting.
123s _set_socket_options(sock, socket_options)
123s
123s if timeout is not _DEFAULT_TIMEOUT:
123s sock.settimeout(timeout)
123s if source_address:
123s sock.bind(source_address)
123s > sock.connect(sa)
123s E ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s def urlopen( # type: ignore[override]
123s self,
123s method: str,
123s url: str,
123s body: _TYPE_BODY | None = None,
123s headers: typing.Mapping[str, str] | None = None,
123s retries: Retry | bool | int | None = None,
123s redirect: bool = True,
123s assert_same_host: bool =
True,
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s pool_timeout: int | None = None,
123s release_conn: bool | None = None,
123s chunked: bool = False,
123s body_pos: _TYPE_BODY_POSITION | None = None,
123s preload_content: bool = True,
123s decode_content: bool = True,
123s **response_kw: typing.Any,
123s ) -> BaseHTTPResponse:
123s """
123s Get a connection from the pool and perform an HTTP request. This is the
123s lowest level call for making a request, so you'll need to specify all
123s the raw details.
123s
123s .. note::
123s
123s More commonly, it's appropriate to use a convenience method
123s such as :meth:`request`.
123s
123s .. note::
123s
123s `release_conn` will only behave as expected if
123s `preload_content=False` because we want to make
123s `preload_content=False` the default behaviour someday soon without
123s breaking backwards compatibility.
123s
123s :param method:
123s HTTP request method (such as GET, POST, PUT, etc.)
123s
123s :param url:
123s The URL to perform the request on.
123s
123s :param body:
123s Data to send in the request body, either :class:`str`, :class:`bytes`,
123s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s :param headers:
123s Dictionary of custom headers to send, such as User-Agent,
123s If-None-Match, etc. If None, pool headers are used. If provided,
123s these headers completely replace any pool-specific headers.
123s
123s :param retries:
123s Configure the number of retries to allow before raising a
123s :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s Pass ``None`` to retry until you receive a response. Pass a
123s :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s over different types of retries.
123s Pass an integer number to retry connection errors that many times,
123s but no other types of errors. Pass zero to never retry.
123s
123s If ``False``, then retries are disabled and any exception is raised
123s immediately.
Also, instead of raising a MaxRetryError on redirects,
123s the redirect response will be returned.
123s
123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s :param redirect:
123s If True, automatically handle redirects (status codes 301, 302,
123s 303, 307, 308). Each redirect counts as a retry. Disabling retries
123s will disable redirect, too.
123s
123s :param assert_same_host:
123s If ``True``, will make sure that the host of the pool requests is
123s consistent else will raise HostChangedError. When ``False``, you can
123s use the pool on an HTTP proxy and request foreign hosts.
123s
123s :param timeout:
123s If specified, overrides the default timeout for this one
123s request. It may be a float (in seconds) or an instance of
123s :class:`urllib3.util.Timeout`.
123s
123s :param pool_timeout:
123s If set and the pool is set to block=True, then this method will
123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s connection is available within the time period.
123s
123s :param bool preload_content:
123s If True, the response's body will be preloaded into memory.
123s
123s :param bool decode_content:
123s If True, will attempt to decode the body based on the
123s 'content-encoding' header.
123s
123s :param release_conn:
123s If False, then the urlopen call will not release the connection
123s back into the pool once a response is received (but will release if
123s you read the entire contents of the response such as when
123s `preload_content=True`). This is useful if you're not preloading
123s the response's content immediately. You will need to call
123s ``r.release_conn()`` on the response ``r`` to return the connection
123s back into the pool. If None, it takes the value of ``preload_content``
123s which defaults to ``True``.
123s
123s :param bool chunked:
123s If True, urllib3 will send the body using chunked transfer
123s encoding.
Otherwise, urllib3 will send the body using the standard
123s content-length form. Defaults to False.
123s
123s :param int body_pos:
123s Position to seek to in file-like body in the event of a retry or
123s redirect. Typically this won't need to be set because urllib3 will
123s auto-populate the value when needed.
123s """
123s parsed_url = parse_url(url)
123s destination_scheme = parsed_url.scheme
123s
123s if headers is None:
123s headers = self.headers
123s
123s if not isinstance(retries, Retry):
123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s if release_conn is None:
123s release_conn = preload_content
123s
123s # Check host
123s if assert_same_host and not self.is_same_host(url):
123s raise HostChangedError(self, url, retries)
123s
123s # Ensure that the URL we're connecting to is properly encoded
123s if url.startswith("/"):
123s url = to_str(_encode_target(url))
123s else:
123s url = to_str(parsed_url.url)
123s
123s conn = None
123s
123s # Track whether `conn` needs to be released before
123s # returning/raising/recursing. Update this variable if necessary, and
123s # leave `release_conn` constant throughout the function. That way, if
123s # the function recurses, the original value of `release_conn` will be
123s # passed down into the recursive call, and its value will be respected.
123s #
123s # See issue #651 [1] for details.
123s #
123s # [1]
123s release_this_conn = release_conn
123s
123s http_tunnel_required = connection_requires_http_tunnel(
123s self.proxy, self.proxy_config, destination_scheme
123s )
123s
123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s # have to copy the headers dict so we can safely change it without those
123s # changes being reflected in anyone else's copy.
123s if not http_tunnel_required:
123s headers = headers.copy() # type: ignore[attr-defined]
123s headers.update(self.proxy_headers) # type: ignore[union-attr]
123s
123s # Must keep the exception bound to a separate variable or else Python 3
123s # complains about UnboundLocalError.
123s err = None
123s
123s # Keep track of whether we cleanly exited the except block. This
123s # ensures we do proper cleanup in finally.
123s clean_exit = False
123s
123s # Rewind body position, if needed. Record current position
123s # for future rewinds in the event of a redirect/retry.
123s body_pos = set_file_position(body, body_pos)
123s
123s try:
123s # Request a connection from the queue.
123s timeout_obj = self._get_timeout(timeout)
123s conn = self._get_conn(timeout=pool_timeout)
123s
123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s
123s # Is this a closed/new connection that requires CONNECT tunnelling?
123s if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s try:
123s self._prepare_proxy(conn)
123s except (BaseSSLError, OSError, SocketTimeout) as e:
123s self._raise_timeout(
123s err=e, url=self.proxy.url, timeout_value=conn.timeout
123s )
123s raise
123s
123s # If we're going to release the connection in ``finally:``, then
123s # the response doesn't need to know about the connection. Otherwise
123s # it will also try to release it and we'll have a double-release
123s # mess.
123s response_conn = conn if not release_conn else None
123s
123s # Make the request on the HTTPConnection object
123s > response = self._make_request(
123s conn,
123s method,
123s url,
123s timeout=timeout_obj,
123s body=body,
123s headers=headers,
123s chunked=chunked,
123s retries=retries,
123s response_conn=response_conn,
123s preload_content=preload_content,
123s decode_content=decode_content,
123s **response_kw,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s except socket.gaierror as e:
123s raise NameResolutionError(self.host, self, e) from e
123s except SocketTimeout as e:
123s raise ConnectTimeoutError(
123s self,
123s f"Connection to {self.host} timed out.
(connect timeout={self.timeout})",
123s ) from e
123s
123s except OSError as e:
123s > raise NewConnectionError(
123s self, f"Failed to establish a new connection: {e}"
123s ) from e
123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s > resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s def increment(
123s self,
123s method: str |
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
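The frames above show a `Retry(total=0, connect=None, read=False, redirect=None, status=None)` policy exhausting on its very first `increment()` call and raising `MaxRetryError`. A minimal standalone sketch of that behaviour (the method, URL, and errno mirror the log; everything else is an illustrative assumption, not taken from the test suite):

```python
# Sketch: a urllib3 Retry with total=0 exhausts on the first increment()
# and raises MaxRetryError, as seen in the traceback above.
from urllib3.util.retry import Retry
from urllib3.exceptions import MaxRetryError

retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)
try:
    # Any request error decrements `total` (0 -> -1), exhausting the policy;
    # increment() then raises MaxRetryError instead of returning a new Retry.
    retry.increment(
        method="GET",
        url="/a%40b/api/contents",
        error=OSError(111, "Connection refused"),
    )
except MaxRetryError as exc:
    print("retries exhausted:", exc.reason)
```

`requests` builds exactly this kind of zero-retry policy from `HTTPAdapter`'s default `max_retries`, which is why a single connection refusal surfaces immediately rather than being retried.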
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 123s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 123s > super().setup_class() 123s 123s notebook/services/sessions/tests/test_sessions_api.py:274: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ____________ ERROR at setup of AsyncSessionAPITest.test_modify_path ____________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = 
True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. 
Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. 
Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 
123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 
123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | 
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
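`HTTPAdapter.send`, quoted repeatedly in these tracebacks, normalizes its `timeout` argument before calling `conn.urlopen`: a `(connect, read)` tuple or a bare number becomes an urllib3 `Timeout` (imported as `TimeoutSauce` in requests). A hedged sketch of that normalization as a free function (`normalize_timeout` is an illustrative name, not a requests API):

```python
# Sketch of the timeout normalization performed inside HTTPAdapter.send:
# tuples become Timeout(connect=..., read=...), bare numbers set both,
# and an existing Timeout passes through unchanged.
from urllib3.util.timeout import Timeout as TimeoutSauce

def normalize_timeout(timeout):
    if isinstance(timeout, tuple):
        try:
            connect, read = timeout
        except ValueError:
            # Anything other than a 2-tuple fails to unpack.
            raise ValueError(
                f"Invalid timeout {timeout}. Pass a (connect, read) timeout "
                f"tuple, or a single float to set both timeouts to the same value."
            )
        return TimeoutSauce(connect=connect, read=read)
    if isinstance(timeout, TimeoutSauce):
        return timeout
    return TimeoutSauce(connect=timeout, read=timeout)
```

With `timeout=None` (the default used by `requests.get(url)` in `fetch_url` above), this yields the `Timeout(connect=None, read=None, total=None)` visible in the log's `timeout =` reprs, i.e. block until the OS gives up.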
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 123s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 123s > super().setup_class() 123s 123s notebook/services/sessions/tests/test_sessions_api.py:274: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ______ ERROR at setup of AsyncSessionAPITest.test_modify_path_deprecated _______ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = 
True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. 
Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. 
Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 
123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 
123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | 
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 123s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 123s > super().setup_class() 123s 123s notebook/services/sessions/tests/test_sessions_api.py:274: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ____________ ERROR at setup of AsyncSessionAPITest.test_modify_type ____________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s     family = allowed_gai_family()
123s 
123s     try:
123s         host.encode("idna")
123s     except UnicodeError:
123s         raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s         af, socktype, proto, canonname, sa = res
123s         sock = None
123s         try:
123s             sock = socket.socket(af, socktype, proto)
123s 
123s             # If provided, set socket level options before connecting.
123s             _set_socket_options(sock, socket_options)
123s 
123s             if timeout is not _DEFAULT_TIMEOUT:
123s                 sock.settimeout(timeout)
123s             if source_address:
123s                 sock.bind(source_address)
123s >           sock.connect(sa)
123s E           ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s 
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         if not async_testing_enabled:  # Can be removed once jupyter_client >= 6.1 is required.
123s             raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!")
123s >       super().setup_class()
123s 
123s notebook/services/sessions/tests/test_sessions_api.py:274: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:198: in setup_class
123s     cls.wait_until_alive()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ____________ ERROR at setup of TerminalAPITest.test_create_terminal ____________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = 
True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. 
Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. 
Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 
123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 
123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | 
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ________ ERROR at setup of TerminalAPITest.test_create_terminal_via_get ________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _______ ERROR at setup of TerminalAPITest.test_create_terminal_with_name _______ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _____________ ERROR at setup of TerminalAPITest.test_no_terminals ______________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s 
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s 
123s         .. note::
123s 
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s 
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s             status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ___________ ERROR at setup of TerminalAPITest.test_terminal_handler ____________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _________ ERROR at setup of TerminalAPITest.test_terminal_root_handler _________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
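The `cls.wait_until_alive()` call that fails here polls the server URL until it answers or the server thread dies. The underlying pattern, poll a TCP endpoint with a deadline and a fixed interval, can be sketched with the standard library; the names below are illustrative, not taken from the notebook test suite:

```python
import socket
import time

def wait_for_port(host, port, max_wait=5.0, poll_interval=0.1):
    """Poll until a TCP listener accepts connections, or give up.

    Returns True as soon as a connection succeeds, False once
    max_wait seconds have elapsed without one.
    """
    deadline = time.monotonic() + max_wait
    while time.monotonic() < deadline:
        try:
            # A successful connect means something is listening.
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError:
            # Refused / unreachable: wait and retry until the deadline.
            time.sleep(poll_interval)
    return False
```

The real helper additionally checks `cls.notebook_thread.is_alive()` on each failure so a server that crashed during startup fails fast instead of burning the whole wait budget, which is exactly the branch that raises the RuntimeError shown below in this log.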
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ______________ ERROR at setup of TerminalCullingTest.test_config _______________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s A host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting.
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
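Related to the *timeout* parameter documented above: the `HTTPAdapter.send` frames earlier in this log normalize a `(connect, read)` tuple or a single number into one timeout object before calling `urlopen`. That branching can be modeled in isolation; this is a simplified illustration of the quoted logic, not the requests implementation:

```python
def normalize_timeout(timeout):
    """Split a requests-style timeout into (connect, read) budgets.

    A (connect, read) tuple is split into separate budgets; a single
    number sets both; None means "no timeout" for both phases.
    """
    if isinstance(timeout, tuple):
        try:
            connect, read = timeout
        except ValueError:
            # Wrong-length tuple: re-raise with the same guidance the
            # requests adapter gives in the frames quoted above.
            raise ValueError(
                f"Invalid timeout {timeout}. Pass a (connect, read) timeout "
                f"tuple, or a single float to set both timeouts to the same value."
            )
        return connect, read
    return timeout, timeout

print(normalize_timeout((3.05, 27)))  # (3.05, 27)
print(normalize_timeout(5))           # (5, 5)
```

In this log both budgets are None (`Timeout(connect=None, read=None, total=None)`), so the connect attempt fails on ECONNREFUSED immediately rather than on a timeout.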
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ______________ ERROR at setup of TerminalCullingTest.test_culling ______________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s             raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ______________ ERROR at setup of FilesTest.test_contents_manager _______________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s def create_connection(
123s     address: tuple[str, int],
123s     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s     source_address: tuple[str, int] | None = None,
123s     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s     """Connect to *address* and return the socket object.
123s 
123s     Convenience function. Connect to *address* (a 2-tuple ``(host,
123s     port)``) and return the socket object.
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s         raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s __________________ ERROR at setup of FilesTest.test_download ___________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s 
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s 
123s     def urlopen( # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s 
123s         .. note::
123s 
123s             More commonly, it's appropriate to use a convenience method
123s             such as :meth:`request`.
123s 
123s         .. note::
123s 
123s             `release_conn` will only behave as expected if
123s             `preload_content=False` because we want to make
123s             `preload_content=False` the default behaviour someday soon without
123s             breaking backwards compatibility.
123s 
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s 
123s         :param url:
123s             The URL to perform the request on.
123s 
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s 
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s 
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s 
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s 
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s 
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s 
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s 
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s 
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s 
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s 
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s 
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s 
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s 
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s 
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s 
123s         if headers is None:
123s             headers = self.headers
123s 
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s 
123s         if release_conn is None:
123s             release_conn = preload_content
123s 
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s 
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s 
123s         conn = None
123s 
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1] 
123s         release_this_conn = release_conn
123s 
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s 
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy() # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers) # type: ignore[union-attr]
123s 
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s 
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s 
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s 
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s 
123s             conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s 
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s 
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s 
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s 
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool = 
123s _stacktrace = 
123s 
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s 
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s 
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s 
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s 
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s 
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s 
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s 
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s 
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s self = 
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s 
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s 
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s 
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s 
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s 
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s 
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s 
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s 
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s 
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s 
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s 
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s 
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                 import asyncio
123s 
123s                 asyncio.set_event_loop(asyncio.new_event_loop())
123s                 # Patch the current loop in order to match production
123s                 # behavior
123s                 import nest_asyncio
123s 
123s                 nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ________________ ERROR at setup of FilesTest.test_hidden_files _________________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s 
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s 
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s 
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s 
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s 
123s                 # If provided, set socket level options before connecting.
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s             raise SSLError(e, request=request)
123s 
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s 
123s The above exception was the direct cause of the following exception:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s 
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s 
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s 
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s 
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s 
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s 
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s 
123s notebook/tests/launchnotebook.py:198: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s 
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _____________ ERROR at setup of FilesTest.test_old_files_redirect ______________
123s 
123s self = 
123s 
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s 
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s 
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s 
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s 
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s 
123s         Convenience function.  Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object.
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the
123s lowest level call for making a request, so you'll need to specify all
123s the raw details.
123s
123s .. note::
123s
123s More commonly, it's appropriate to use a convenience method
123s such as :meth:`request`.
123s
123s .. note::
123s
123s `release_conn` will only behave as expected if
123s `preload_content=False` because we want to make
123s `preload_content=False` the default behaviour someday soon without
123s breaking backwards compatibility.
123s
123s :param method:
123s HTTP request method (such as GET, POST, PUT, etc.)
123s
123s :param url:
123s The URL to perform the request on.
123s
123s :param body:
123s Data to send in the request body, either :class:`str`, :class:`bytes`,
123s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s :param headers:
123s Dictionary of custom headers to send, such as User-Agent,
123s If-None-Match, etc. If None, pool headers are used. If provided,
123s these headers completely replace any pool-specific headers.
123s
123s :param retries:
123s Configure the number of retries to allow before raising a
123s :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s Pass ``None`` to retry until you receive a response. Pass a
123s :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s over different types of retries.
123s Pass an integer number to retry connection errors that many times,
123s but no other types of errors. Pass zero to never retry.
123s
123s If ``False``, then retries are disabled and any exception is raised
123s immediately. Also, instead of raising a MaxRetryError on redirects,
123s the redirect response will be returned.
123s
123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s :param redirect:
123s If True, automatically handle redirects (status codes 301, 302,
123s 303, 307, 308). Each redirect counts as a retry. Disabling retries
123s will disable redirect, too.
123s
123s :param assert_same_host:
123s If ``True``, will make sure that the host of the pool requests is
123s consistent else will raise HostChangedError. When ``False``, you can
123s use the pool on an HTTP proxy and request foreign hosts.
123s
123s :param timeout:
123s If specified, overrides the default timeout for this one
123s request. It may be a float (in seconds) or an instance of
123s :class:`urllib3.util.Timeout`.
123s
123s :param pool_timeout:
123s If set and the pool is set to block=True, then this method will
123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s connection is available within the time period.
123s
123s :param bool preload_content:
123s If True, the response's body will be preloaded into memory.
123s
123s :param bool decode_content:
123s If True, will attempt to decode the body based on the
123s 'content-encoding' header.
123s
123s :param release_conn:
123s If False, then the urlopen call will not release the connection
123s back into the pool once a response is received (but will release if
123s you read the entire contents of the response such as when
123s `preload_content=True`). This is useful if you're not preloading
123s the response's content immediately. You will need to call
123s ``r.release_conn()`` on the response ``r`` to return the connection
123s back into the pool. If None, it takes the value of ``preload_content``
123s which defaults to ``True``.
123s
123s :param bool chunked:
123s If True, urllib3 will send the body using chunked transfer
123s encoding. Otherwise, urllib3 will send the body using the standard
123s content-length form. Defaults to False.
123s
123s :param int body_pos:
123s Position to seek to in file-like body in the event of a retry or
123s redirect. Typically this won't need to be set because urllib3 will
123s auto-populate the value when needed.
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position
123s # for future rewinds in the event of a redirect/retry.
123s body_pos = set_file_position(body, body_pos)
123s
123s try:
123s # Request a connection from the queue.
123s timeout_obj = self._get_timeout(timeout)
123s conn = self._get_conn(timeout=pool_timeout)
123s
123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
123s
123s # Is this a closed/new connection that requires CONNECT tunnelling?
123s if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s try:
123s self._prepare_proxy(conn)
123s except (BaseSSLError, OSError, SocketTimeout) as e:
123s self._raise_timeout(
123s err=e, url=self.proxy.url, timeout_value=conn.timeout
123s )
123s raise
123s
123s # If we're going to release the connection in ``finally:``, then
123s # the response doesn't need to know about the connection. Otherwise
123s # it will also try to release it and we'll have a double-release
123s # mess.
123s response_conn = conn if not release_conn else None
123s
123s # Make the request on the HTTPConnection object
123s > response = self._make_request(
123s conn,
123s method,
123s url,
123s timeout=timeout_obj,
123s body=body,
123s headers=headers,
123s chunked=chunked,
123s retries=retries,
123s response_conn=response_conn,
123s preload_content=preload_content,
123s decode_content=decode_content,
123s **response_kw,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in
send
123s self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s except socket.gaierror as e:
123s raise NameResolutionError(self.host, self, e) from e
123s except SocketTimeout as e:
123s raise ConnectTimeoutError(
123s self,
123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s ) from e
123s
123s except OSError as e:
123s > raise NewConnectionError(
123s self, f"Failed to establish a new connection: {e}"
123s ) from e
123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s > resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s def increment(
123s self,
123s method: str | None = None,
123s url: str | None = None,
123s response: BaseHTTPResponse | None = None,
123s error: Exception | None = None,
123s _pool: ConnectionPool | None = None,
123s _stacktrace: TracebackType | None = None,
123s ) -> Retry:
123s """Return a new Retry object with incremented retry counters.
123s
123s :param response: A response object, or None, if the server did not
123s return a response.
123s :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s :param Exception error: An error encountered during the request, or
123s None if the response was received successfully.
123s
123s :return: A new ``Retry`` object.
123s """
123s if self.total is False and error:
123s # Disabled, indicate to re-raise the error.
123s raise reraise(type(error), error, _stacktrace)
123s
123s total = self.total
123s if total is not None:
123s total -= 1
123s
123s connect = self.connect
123s read = self.read
123s redirect = self.redirect
123s status_count = self.status
123s other = self.other
123s cause = "unknown"
123s status = None
123s redirect_location = None
123s
123s if error and self._is_connection_error(error):
123s # Connect retry?
123s if connect is False:
123s raise reraise(type(error), error, _stacktrace)
123s elif connect is not None:
123s connect -= 1
123s
123s elif error and self._is_read_error(error):
123s # Read retry?
123s if read is False or method is None or not self._is_method_retryable(method):
123s raise reraise(type(error), error, _stacktrace)
123s elif read is not None:
123s read -= 1
123s
123s elif error:
123s # Other retry?
123s if other is not None:
123s other -= 1
123s
123s elif response and response.get_redirect_location():
123s # Redirect retry?
123s if redirect is not None:
123s redirect -= 1
123s cause = "too many redirects"
123s response_redirect_location = response.get_redirect_location()
123s if response_redirect_location:
123s redirect_location = response_redirect_location
123s status = response.status
123s
123s else:
123s # Incrementing because of a server error like a 500 in
123s # status_forcelist and the given method is in the allowed_methods
123s cause = ResponseError.GENERIC_ERROR
123s if response and response.status:
123s if status_count is not None:
123s status_count -= 1
123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s status = response.status
123s
123s history = self.history + (
123s RequestHistory(method, url, error, status, redirect_location),
123s )
123s
123s new_retry = self.new(
123s total=total,
123s connect=connect,
123s read=read,
123s redirect=redirect,
123s status=status_count,
123s other=other,
123s history=history,
123s )
123s
123s if new_retry.is_exhausted():
123s reason = error or
ResponseError(cause)
123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s @classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s > cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s def send(
123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s ):
123s """Sends PreparedRequest object. Returns Response object.
123s
123s :param request: The :class:`PreparedRequest ` being sent.
123s :param stream: (optional) Whether to stream the request content.
123s :param timeout: (optional) How long to wait for the server to send
123s data before giving up, as a float, or a :ref:`(connect timeout,
123s read timeout) ` tuple.
123s :type timeout: float or tuple or urllib3 Timeout object
123s :param verify: (optional) Either a boolean, in which case it controls whether
123s we verify the server's TLS certificate, or a string, in which case it
123s must be a path to a CA bundle to use
123s :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response
123s """
123s
123s try:
123s conn = self.get_connection(request.url, proxies)
123s except LocationValueError as e:
123s raise InvalidURL(e, request=request)
123s
123s self.cert_verify(conn, request.url, verify, cert)
123s url = self.request_url(request, proxies)
123s self.add_headers(
123s request,
123s stream=stream,
123s timeout=timeout,
123s verify=verify,
123s cert=cert,
123s proxies=proxies,
123s )
123s
123s chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s if isinstance(timeout, tuple):
123s try:
123s connect, read = timeout
123s timeout = TimeoutSauce(connect=connect, read=read)
123s except ValueError:
123s raise ValueError(
123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s f"or a single float to set both timeouts to the same value."
123s )
123s elif isinstance(timeout, TimeoutSauce):
123s pass
123s else:
123s timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s try:
123s resp = conn.urlopen(
123s method=request.method,
123s url=url,
123s body=request.body,
123s headers=request.headers,
123s redirect=False,
123s assert_same_host=False,
123s preload_content=False,
123s decode_content=False,
123s retries=self.max_retries,
123s timeout=timeout,
123s chunked=chunked,
123s )
123s
123s except (ProtocolError, OSError) as err:
123s raise ConnectionError(err, request=request)
123s
123s except MaxRetryError as e:
123s if isinstance(e.reason, ConnectTimeoutError):
123s # TODO: Remove this in 3.0.0: see #2811
123s if not isinstance(e.reason, NewConnectionError):
123s raise ConnectTimeout(e, request=request)
123s
123s if isinstance(e.reason, ResponseError):
123s raise RetryError(e, request=request)
123s
123s if isinstance(e.reason, _ProxyError):
123s raise ProxyError(e, request=request)
123s
123s if isinstance(e.reason, _SSLError):
123s # This branch is for urllib3 v1.22 and later.
123s raise SSLError(e, request=request)
123s
123s > raise ConnectionError(e, request=request)
123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s @classmethod
123s def setup_class(cls):
123s cls.tmp_dir = TemporaryDirectory()
123s def tmp(*parts):
123s path = os.path.join(cls.tmp_dir.name, *parts)
123s try:
123s os.makedirs(path)
123s except OSError as e:
123s if e.errno != errno.EEXIST:
123s raise
123s return path
123s
123s cls.home_dir = tmp('home')
123s data_dir = cls.data_dir = tmp('data')
123s config_dir = cls.config_dir = tmp('config')
123s runtime_dir = cls.runtime_dir = tmp('runtime')
123s cls.notebook_dir = tmp('notebooks')
123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s cls.env_patch.start()
123s # Patch systemwide & user-wide data & config directories, to isolate
123s # the tests from oddities of the local setup. But leave Python env
123s # locations alone, so data files for e.g. nbconvert are accessible.
123s # If this isolation isn't sufficient, you may need to run the tests in
123s # a virtualenv or conda env.
123s cls.path_patch = patch.multiple(
123s jupyter_core.paths,
123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s )
123s cls.path_patch.start()
123s
123s config = cls.config or Config()
123s config.NotebookNotary.db_file = ':memory:'
123s
123s cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s started = Event()
123s def start_thread():
123s try:
123s bind_args = cls.get_bind_args()
123s app = cls.notebook = NotebookApp(
123s port_retries=0,
123s open_browser=False,
123s config_dir=cls.config_dir,
123s data_dir=cls.data_dir,
123s runtime_dir=cls.runtime_dir,
123s notebook_dir=cls.notebook_dir,
123s base_url=cls.url_prefix,
123s config=config,
123s allow_root=True,
123s token=cls.token,
123s **bind_args
123s )
123s if "asyncio" in sys.modules:
123s app._init_asyncio_patch()
123s import asyncio
123s
123s asyncio.set_event_loop(asyncio.new_event_loop())
123s # Patch the current loop in order to match production
123s # behavior
123s import nest_asyncio
123s
123s nest_asyncio.apply()
123s # don't register signal handler during tests
123s app.init_signal = lambda : None
123s # clear log handlers and propagate to root for nose to capture it
123s # needs to be redone after initialize, which reconfigures logging
123s app.log.propagate = True
123s app.log.handlers = []
123s app.initialize(argv=cls.get_argv())
123s app.log.propagate = True
123s app.log.handlers = []
123s loop = IOLoop.current()
123s loop.add_callback(started.set)
123s app.start()
123s finally:
123s # set the event, so failure to start doesn't cause a hang
123s started.set()
123s app.session_manager.close()
123s cls.notebook_thread = Thread(target=start_thread)
123s cls.notebook_thread.daemon = True
123s cls.notebook_thread.start()
123s started.wait()
123s > cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s
@classmethod
123s def wait_until_alive(cls):
123s """Wait for the server to be alive"""
123s url = cls.base_url() + 'api/contents'
123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s try:
123s cls.fetch_url(url)
123s except ModuleNotFoundError as error:
123s # Errors that should be immediately thrown back to caller
123s raise error
123s except Exception as e:
123s if not cls.notebook_thread.is_alive():
123s > raise RuntimeError("The notebook server failed to start") from e
123s E RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s __________________ ERROR at setup of FilesTest.test_view_html __________________
123s
123s self =
123s
123s def _new_conn(self) -> socket.socket:
123s """Establish a socket connection and set nodelay settings on it.
123s
123s :return: New socket connection.
123s """
123s try:
123s > sock = connection.create_connection(
123s (self._dns_host, self.port),
123s self.timeout,
123s source_address=self.source_address,
123s socket_options=self.socket_options,
123s )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s def create_connection(
123s address: tuple[str, int],
123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s source_address: tuple[str, int] | None = None,
123s socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s ) -> socket.socket:
123s """Connect to *address* and return the socket object.
123s
123s Convenience function. Connect to *address* (a 2-tuple ``(host,
123s port)``) and return the socket object.
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s     @classmethod
123s     def setup_class(cls):
123s         cls.tmp_dir = TemporaryDirectory()
123s         def tmp(*parts):
123s             path = os.path.join(cls.tmp_dir.name, *parts)
123s             try:
123s                 os.makedirs(path)
123s             except OSError as e:
123s                 if e.errno != errno.EEXIST:
123s                     raise
123s             return path
123s
123s         cls.home_dir = tmp('home')
123s         data_dir = cls.data_dir = tmp('data')
123s         config_dir = cls.config_dir = tmp('config')
123s         runtime_dir = cls.runtime_dir = tmp('runtime')
123s         cls.notebook_dir = tmp('notebooks')
123s         cls.env_patch = patch.dict('os.environ', cls.get_patch_env())
123s         cls.env_patch.start()
123s         # Patch systemwide & user-wide data & config directories, to isolate
123s         # the tests from oddities of the local setup. But leave Python env
123s         # locations alone, so data files for e.g. nbconvert are accessible.
123s         # If this isolation isn't sufficient, you may need to run the tests in
123s         # a virtualenv or conda env.
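The ``tmp`` helper in the ``setup_class`` listing above creates nested directories and swallows ``EEXIST`` so repeated calls are idempotent. On Python 3 the same effect can be had with ``exist_ok=True``; a small self-contained sketch (``make_tmp`` is an illustrative name, not the test suite's helper):

```python
import os
import tempfile

def make_tmp(base, *parts):
    # Equivalent of the tmp() helper above: build the nested path and
    # tolerate it already existing (exist_ok=True replaces the
    # errno.EEXIST dance on modern Python).
    path = os.path.join(base, *parts)
    os.makedirs(path, exist_ok=True)
    return path

base = tempfile.mkdtemp()
home = make_tmp(base, 'home')
assert make_tmp(base, 'home') == home   # second call is a no-op
assert os.path.isdir(home)
```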
123s         cls.path_patch = patch.multiple(
123s             jupyter_core.paths,
123s             SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')],
123s             SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')],
123s         )
123s         cls.path_patch.start()
123s
123s         config = cls.config or Config()
123s         config.NotebookNotary.db_file = ':memory:'
123s
123s         cls.token = hexlify(os.urandom(4)).decode('ascii')
123s
123s         started = Event()
123s         def start_thread():
123s             try:
123s                 bind_args = cls.get_bind_args()
123s                 app = cls.notebook = NotebookApp(
123s                     port_retries=0,
123s                     open_browser=False,
123s                     config_dir=cls.config_dir,
123s                     data_dir=cls.data_dir,
123s                     runtime_dir=cls.runtime_dir,
123s                     notebook_dir=cls.notebook_dir,
123s                     base_url=cls.url_prefix,
123s                     config=config,
123s                     allow_root=True,
123s                     token=cls.token,
123s                     **bind_args
123s                 )
123s                 if "asyncio" in sys.modules:
123s                     app._init_asyncio_patch()
123s                     import asyncio
123s
123s                     asyncio.set_event_loop(asyncio.new_event_loop())
123s                     # Patch the current loop in order to match production
123s                     # behavior
123s                     import nest_asyncio
123s
123s                     nest_asyncio.apply()
123s                 # don't register signal handler during tests
123s                 app.init_signal = lambda : None
123s                 # clear log handlers and propagate to root for nose to capture it
123s                 # needs to be redone after initialize, which reconfigures logging
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 app.initialize(argv=cls.get_argv())
123s                 app.log.propagate = True
123s                 app.log.handlers = []
123s                 loop = IOLoop.current()
123s                 loop.add_callback(started.set)
123s                 app.start()
123s             finally:
123s                 # set the event, so failure to start doesn't cause a hang
123s                 started.set()
123s                 app.session_manager.close()
123s         cls.notebook_thread = Thread(target=start_thread)
123s         cls.notebook_thread.daemon = True
123s         cls.notebook_thread.start()
123s         started.wait()
123s >       cls.wait_until_alive()
123s
123s notebook/tests/launchnotebook.py:198:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s __________ ERROR at setup of TestGateway.test_gateway_class_mappings ___________
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s >           sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
123s     raise err
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s address = ('localhost', 12341), timeout = None, source_address = None
123s socket_options = [(6, 1, 1)]
123s
123s     def create_connection(
123s         address: tuple[str, int],
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         source_address: tuple[str, int] | None = None,
123s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
123s     ) -> socket.socket:
123s         """Connect to *address* and return the socket object.
123s
123s         Convenience function. Connect to *address* (a 2-tuple ``(host,
123s         port)``) and return the socket object. Passing the optional
123s         *timeout* parameter will set the timeout on the socket instance
123s         before attempting to connect. If no *timeout* is supplied, the
123s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
123s         is used. If *source_address* is set it must be a tuple of (host, port)
123s         for the socket to bind as a source address before making the connection.
123s         An host of '' or port 0 tells the OS to use the default.
123s         """
123s
123s         host, port = address
123s         if host.startswith("["):
123s             host = host.strip("[]")
123s         err = None
123s
123s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
123s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
123s         # The original create_connection function always returns all records.
123s         family = allowed_gai_family()
123s
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s
123s                 # If provided, set socket level options before connecting.
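``create_connection`` in the listing above loops over every address returned by ``getaddrinfo``, remembering the most recent ``OSError`` and re-raising it (the ``raise err`` frame) only once no candidate connects. The same last-error pattern, sketched with plain callables instead of real sockets (``connect_first`` and its arguments are illustrative, not urllib3 code):

```python
def connect_first(candidates):
    # Try each candidate in order; remember the last failure and
    # surface it only after every candidate has failed, mirroring the
    # getaddrinfo loop above.
    err = None
    for attempt in candidates:
        try:
            return attempt()
        except OSError as exc:
            err = exc
    if err is not None:
        raise err
    raise OSError("no addresses to try")

def refused():
    # Stand-in for sock.connect(sa) hitting a closed port, as in this log.
    raise ConnectionRefusedError(111, "Connection refused")

assert connect_first([refused, lambda: "connected"]) == "connected"
```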
123s                 _set_socket_options(sock, socket_options)
123s
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s             More commonly, it's appropriate to use a convenience method
123s             such as :meth:`request`.
123s
123s         .. note::
123s
123s             `release_conn` will only behave as expected if
123s             `preload_content=False` because we want to make
123s             `preload_content=False` the default behaviour someday soon without
123s             breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
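The ``Retry(total=0, ...)`` objects throughout this log explain why a single refused connection is fatal here: one failed attempt decrements the budget below zero, the new ``Retry`` reports itself exhausted, and a ``MaxRetryError``-style exception ends the request. A toy model of that bookkeeping (invented names, not the real ``Retry.increment``):

```python
class BudgetExhausted(Exception):
    """Stand-in for urllib3's MaxRetryError in this sketch."""

def increment(total):
    # Each failure consumes one unit of budget; a negative budget
    # means retries are exhausted (cf. Retry(total=0) above, which
    # allows the initial attempt but no retries).
    total -= 1
    if total < 0:
        raise BudgetExhausted("Max retries exceeded")
    return total

assert increment(2) == 1
try:
    increment(0)          # total=0: the first failure already exhausts it
    raise AssertionError("expected BudgetExhausted")
except BudgetExhausted:
    pass
```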
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s GatewayClient.clear_instance() 123s > super().setup_class() 123s 123s notebook/tests/test_gateway.py:138: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s __________ ERROR at setup of TestGateway.test_gateway_get_kernelspecs __________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = 
True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. 
Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. 
Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 
123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 
123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | 
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s     @classmethod
123s     def setup_class(cls):
123s         GatewayClient.clear_instance()
123s >       super().setup_class()
123s
123s notebook/tests/test_gateway.py:138:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:198: in setup_class
123s     cls.wait_until_alive()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _______ ERROR at setup of TestGateway.test_gateway_get_named_kernelspec ________
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s         family = allowed_gai_family()
123s
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s
123s         .. note::
123s
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s
123s         if headers is None:
123s             headers = self.headers
123s
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s         if release_conn is None:
123s             release_conn = preload_content
123s
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s
123s         conn = None
123s
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1]
123s         release_this_conn = release_conn
123s
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s     @classmethod
123s     def setup_class(cls):
123s         GatewayClient.clear_instance()
123s >       super().setup_class()
123s
123s notebook/tests/test_gateway.py:138:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:198: in setup_class
123s     cls.wait_until_alive()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s _________ ERROR at setup of TestGateway.test_gateway_kernel_lifecycle __________
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s         family = allowed_gai_family()
123s
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s
123s         .. note::
123s
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s
123s         if headers is None:
123s             headers = self.headers
123s
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s         if release_conn is None:
123s             release_conn = preload_content
123s
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s
123s         conn = None
123s
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1]
123s         release_this_conn = release_conn
123s
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s >           resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:486:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
123s     retries = retries.increment(
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s method = 'GET', url = '/a%40b/api/contents', response = None
123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
123s _pool =
123s _stacktrace =
123s
123s     def increment(
123s         self,
123s         method: str | None = None,
123s         url: str | None = None,
123s         response: BaseHTTPResponse | None = None,
123s         error: Exception | None = None,
123s         _pool: ConnectionPool | None = None,
123s         _stacktrace: TracebackType | None = None,
123s     ) -> Retry:
123s         """Return a new Retry object with incremented retry counters.
123s
123s         :param response: A response object, or None, if the server did not
123s             return a response.
123s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
123s         :param Exception error: An error encountered during the request, or
123s             None if the response was received successfully.
123s
123s         :return: A new ``Retry`` object.
123s         """
123s         if self.total is False and error:
123s             # Disabled, indicate to re-raise the error.
123s             raise reraise(type(error), error, _stacktrace)
123s
123s         total = self.total
123s         if total is not None:
123s             total -= 1
123s
123s         connect = self.connect
123s         read = self.read
123s         redirect = self.redirect
123s         status_count = self.status
123s         other = self.other
123s         cause = "unknown"
123s         status = None
123s         redirect_location = None
123s
123s         if error and self._is_connection_error(error):
123s             # Connect retry?
123s             if connect is False:
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif connect is not None:
123s                 connect -= 1
123s
123s         elif error and self._is_read_error(error):
123s             # Read retry?
123s             if read is False or method is None or not self._is_method_retryable(method):
123s                 raise reraise(type(error), error, _stacktrace)
123s             elif read is not None:
123s                 read -= 1
123s
123s         elif error:
123s             # Other retry?
123s             if other is not None:
123s                 other -= 1
123s
123s         elif response and response.get_redirect_location():
123s             # Redirect retry?
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s
123s During handling of the above exception, another exception occurred:
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s
123s notebook/tests/launchnotebook.py:53:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s /usr/lib/python3/dist-packages/requests/api.py:73: in get
123s     return request("get", url, params=params, **kwargs)
123s /usr/lib/python3/dist-packages/requests/api.py:59: in request
123s     return session.request(method=method, url=url, **kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
123s     resp = self.send(prep, **send_kwargs)
123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
123s     r = adapter.send(request, **kwargs)
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s         :rtype: requests.Response
123s         """
123s
123s         try:
123s             conn = self.get_connection(request.url, proxies)
123s         except LocationValueError as e:
123s             raise InvalidURL(e, request=request)
123s
123s         self.cert_verify(conn, request.url, verify, cert)
123s         url = self.request_url(request, proxies)
123s         self.add_headers(
123s             request,
123s             stream=stream,
123s             timeout=timeout,
123s             verify=verify,
123s             cert=cert,
123s             proxies=proxies,
123s         )
123s
123s         chunked = not (request.body is None or "Content-Length" in request.headers)
123s
123s         if isinstance(timeout, tuple):
123s             try:
123s                 connect, read = timeout
123s                 timeout = TimeoutSauce(connect=connect, read=read)
123s             except ValueError:
123s                 raise ValueError(
123s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
123s                     f"or a single float to set both timeouts to the same value."
123s                 )
123s         elif isinstance(timeout, TimeoutSauce):
123s             pass
123s         else:
123s             timeout = TimeoutSauce(connect=timeout, read=timeout)
123s
123s         try:
123s             resp = conn.urlopen(
123s                 method=request.method,
123s                 url=url,
123s                 body=request.body,
123s                 headers=request.headers,
123s                 redirect=False,
123s                 assert_same_host=False,
123s                 preload_content=False,
123s                 decode_content=False,
123s                 retries=self.max_retries,
123s                 timeout=timeout,
123s                 chunked=chunked,
123s             )
123s
123s         except (ProtocolError, OSError) as err:
123s             raise ConnectionError(err, request=request)
123s
123s         except MaxRetryError as e:
123s             if isinstance(e.reason, ConnectTimeoutError):
123s                 # TODO: Remove this in 3.0.0: see #2811
123s                 if not isinstance(e.reason, NewConnectionError):
123s                     raise ConnectTimeout(e, request=request)
123s
123s             if isinstance(e.reason, ResponseError):
123s                 raise RetryError(e, request=request)
123s
123s             if isinstance(e.reason, _ProxyError):
123s                 raise ProxyError(e, request=request)
123s
123s             if isinstance(e.reason, _SSLError):
123s                 # This branch is for urllib3 v1.22 and later.
123s                 raise SSLError(e, request=request)
123s
123s >           raise ConnectionError(e, request=request)
123s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s
123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s cls =
123s
123s     @classmethod
123s     def setup_class(cls):
123s         GatewayClient.clear_instance()
123s >       super().setup_class()
123s
123s notebook/tests/test_gateway.py:138:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s notebook/tests/launchnotebook.py:198: in setup_class
123s     cls.wait_until_alive()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s cls =
123s
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s                 cls.fetch_url(url)
123s             except ModuleNotFoundError as error:
123s                 # Errors that should be immediately thrown back to caller
123s                 raise error
123s             except Exception as e:
123s                 if not cls.notebook_thread.is_alive():
123s >                   raise RuntimeError("The notebook server failed to start") from e
123s E                   RuntimeError: The notebook server failed to start
123s
123s notebook/tests/launchnotebook.py:59: RuntimeError
123s ______________ ERROR at setup of TestGateway.test_gateway_options ______________
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s         family = allowed_gai_family()
123s
123s         try:
123s             host.encode("idna")
123s         except UnicodeError:
123s             raise LocationParseError(f"'{host}', label empty or too long") from None
123s
123s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
123s             af, socktype, proto, canonname, sa = res
123s             sock = None
123s             try:
123s                 sock = socket.socket(af, socktype, proto)
123s
123s                 # If provided, set socket level options before connecting.
123s                 _set_socket_options(sock, socket_options)
123s
123s                 if timeout is not _DEFAULT_TIMEOUT:
123s                     sock.settimeout(timeout)
123s                 if source_address:
123s                     sock.bind(source_address)
123s >               sock.connect(sa)
123s E               ConnectionRefusedError: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s method = 'GET', url = '/a%40b/api/contents', body = None
123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
123s redirect = False, assert_same_host = False
123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
123s release_conn = False, chunked = False, body_pos = None, preload_content = False
123s decode_content = False, response_kw = {}
123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
123s destination_scheme = None, conn = None, release_this_conn = True
123s http_tunnel_required = False, err = None, clean_exit = False
123s
123s     def urlopen(  # type: ignore[override]
123s         self,
123s         method: str,
123s         url: str,
123s         body: _TYPE_BODY | None = None,
123s         headers: typing.Mapping[str, str] | None = None,
123s         retries: Retry | bool | int | None = None,
123s         redirect: bool = True,
123s         assert_same_host: bool = True,
123s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
123s         pool_timeout: int | None = None,
123s         release_conn: bool | None = None,
123s         chunked: bool = False,
123s         body_pos: _TYPE_BODY_POSITION | None = None,
123s         preload_content: bool = True,
123s         decode_content: bool = True,
123s         **response_kw: typing.Any,
123s     ) -> BaseHTTPResponse:
123s         """
123s         Get a connection from the pool and perform an HTTP request. This is the
123s         lowest level call for making a request, so you'll need to specify all
123s         the raw details.
123s
123s         .. note::
123s
123s            More commonly, it's appropriate to use a convenience method
123s            such as :meth:`request`.
123s
123s         .. note::
123s
123s            `release_conn` will only behave as expected if
123s            `preload_content=False` because we want to make
123s            `preload_content=False` the default behaviour someday soon without
123s            breaking backwards compatibility.
123s
123s         :param method:
123s             HTTP request method (such as GET, POST, PUT, etc.)
123s
123s         :param url:
123s             The URL to perform the request on.
123s
123s         :param body:
123s             Data to send in the request body, either :class:`str`, :class:`bytes`,
123s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
123s
123s         :param headers:
123s             Dictionary of custom headers to send, such as User-Agent,
123s             If-None-Match, etc. If None, pool headers are used. If provided,
123s             these headers completely replace any pool-specific headers.
123s
123s         :param retries:
123s             Configure the number of retries to allow before raising a
123s             :class:`~urllib3.exceptions.MaxRetryError` exception.
123s
123s             Pass ``None`` to retry until you receive a response. Pass a
123s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
123s             over different types of retries.
123s             Pass an integer number to retry connection errors that many times,
123s             but no other types of errors. Pass zero to never retry.
123s
123s             If ``False``, then retries are disabled and any exception is raised
123s             immediately. Also, instead of raising a MaxRetryError on redirects,
123s             the redirect response will be returned.
123s
123s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
123s
123s         :param redirect:
123s             If True, automatically handle redirects (status codes 301, 302,
123s             303, 307, 308). Each redirect counts as a retry. Disabling retries
123s             will disable redirect, too.
123s
123s         :param assert_same_host:
123s             If ``True``, will make sure that the host of the pool requests is
123s             consistent else will raise HostChangedError. When ``False``, you can
123s             use the pool on an HTTP proxy and request foreign hosts.
123s
123s         :param timeout:
123s             If specified, overrides the default timeout for this one
123s             request. It may be a float (in seconds) or an instance of
123s             :class:`urllib3.util.Timeout`.
123s
123s         :param pool_timeout:
123s             If set and the pool is set to block=True, then this method will
123s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
123s             connection is available within the time period.
123s
123s         :param bool preload_content:
123s             If True, the response's body will be preloaded into memory.
123s
123s         :param bool decode_content:
123s             If True, will attempt to decode the body based on the
123s             'content-encoding' header.
123s
123s         :param release_conn:
123s             If False, then the urlopen call will not release the connection
123s             back into the pool once a response is received (but will release if
123s             you read the entire contents of the response such as when
123s             `preload_content=True`). This is useful if you're not preloading
123s             the response's content immediately. You will need to call
123s             ``r.release_conn()`` on the response ``r`` to return the connection
123s             back into the pool. If None, it takes the value of ``preload_content``
123s             which defaults to ``True``.
123s
123s         :param bool chunked:
123s             If True, urllib3 will send the body using chunked transfer
123s             encoding. Otherwise, urllib3 will send the body using the standard
123s             content-length form. Defaults to False.
123s
123s         :param int body_pos:
123s             Position to seek to in file-like body in the event of a retry or
123s             redirect. Typically this won't need to be set because urllib3 will
123s             auto-populate the value when needed.
123s         """
123s         parsed_url = parse_url(url)
123s         destination_scheme = parsed_url.scheme
123s
123s         if headers is None:
123s             headers = self.headers
123s
123s         if not isinstance(retries, Retry):
123s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
123s
123s         if release_conn is None:
123s             release_conn = preload_content
123s
123s         # Check host
123s         if assert_same_host and not self.is_same_host(url):
123s             raise HostChangedError(self, url, retries)
123s
123s         # Ensure that the URL we're connecting to is properly encoded
123s         if url.startswith("/"):
123s             url = to_str(_encode_target(url))
123s         else:
123s             url = to_str(parsed_url.url)
123s
123s         conn = None
123s
123s         # Track whether `conn` needs to be released before
123s         # returning/raising/recursing. Update this variable if necessary, and
123s         # leave `release_conn` constant throughout the function. That way, if
123s         # the function recurses, the original value of `release_conn` will be
123s         # passed down into the recursive call, and its value will be respected.
123s         #
123s         # See issue #651 [1] for details.
123s         #
123s         # [1]
123s         release_this_conn = release_conn
123s
123s         http_tunnel_required = connection_requires_http_tunnel(
123s             self.proxy, self.proxy_config, destination_scheme
123s         )
123s
123s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
123s         # have to copy the headers dict so we can safely change it without those
123s         # changes being reflected in anyone else's copy.
123s         if not http_tunnel_required:
123s             headers = headers.copy()  # type: ignore[attr-defined]
123s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
123s
123s         # Must keep the exception bound to a separate variable or else Python 3
123s         # complains about UnboundLocalError.
123s         err = None
123s
123s         # Keep track of whether we cleanly exited the except block. This
123s         # ensures we do proper cleanup in finally.
123s         clean_exit = False
123s
123s         # Rewind body position, if needed. Record current position
123s         # for future rewinds in the event of a redirect/retry.
123s         body_pos = set_file_position(body, body_pos)
123s
123s         try:
123s             # Request a connection from the queue.
123s             timeout_obj = self._get_timeout(timeout)
123s             conn = self._get_conn(timeout=pool_timeout)
123s
123s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
123s
123s             # Is this a closed/new connection that requires CONNECT tunnelling?
123s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
123s                 try:
123s                     self._prepare_proxy(conn)
123s                 except (BaseSSLError, OSError, SocketTimeout) as e:
123s                     self._raise_timeout(
123s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
123s                     )
123s                     raise
123s
123s             # If we're going to release the connection in ``finally:``, then
123s             # the response doesn't need to know about the connection. Otherwise
123s             # it will also try to release it and we'll have a double-release
123s             # mess.
123s             response_conn = conn if not release_conn else None
123s
123s             # Make the request on the HTTPConnection object
123s >           response = self._make_request(
123s                 conn,
123s                 method,
123s                 url,
123s                 timeout=timeout_obj,
123s                 body=body,
123s                 headers=headers,
123s                 chunked=chunked,
123s                 retries=retries,
123s                 response_conn=response_conn,
123s                 preload_content=preload_content,
123s                 decode_content=decode_content,
123s                 **response_kw,
123s             )
123s
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
123s     conn.request(
123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
123s     self.endheaders()
123s /usr/lib/python3.12/http/client.py:1331: in endheaders
123s     self._send_output(message_body, encode_chunked=encode_chunked)
123s /usr/lib/python3.12/http/client.py:1091: in _send_output
123s     self.send(msg)
123s /usr/lib/python3.12/http/client.py:1035: in send
123s     self.connect()
123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
123s     self.sock = self._new_conn()
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
123s
123s self =
123s
123s     def _new_conn(self) -> socket.socket:
123s         """Establish a socket connection and set nodelay settings on it.
123s
123s         :return: New socket connection.
123s         """
123s         try:
123s             sock = connection.create_connection(
123s                 (self._dns_host, self.port),
123s                 self.timeout,
123s                 source_address=self.source_address,
123s                 socket_options=self.socket_options,
123s             )
123s         except socket.gaierror as e:
123s             raise NameResolutionError(self.host, self, e) from e
123s         except SocketTimeout as e:
123s             raise ConnectTimeoutError(
123s                 self,
123s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
123s             ) from e
123s
123s         except OSError as e:
123s >           raise NewConnectionError(
123s                 self, f"Failed to establish a new connection: {e}"
123s             ) from e
123s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
123s
123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
123s
123s The above exception was the direct cause of the following exception:
123s
123s self =
123s request = , stream = False
123s timeout = Timeout(connect=None, read=None, total=None), verify = True
123s cert = None, proxies = OrderedDict()
123s
123s     def send(
123s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
123s     ):
123s         """Sends PreparedRequest object. Returns Response object.
123s
123s         :param request: The :class:`PreparedRequest ` being sent.
123s         :param stream: (optional) Whether to stream the request content.
123s         :param timeout: (optional) How long to wait for the server to send
123s             data before giving up, as a float, or a :ref:`(connect timeout,
123s             read timeout) ` tuple.
123s         :type timeout: float or tuple or urllib3 Timeout object
123s         :param verify: (optional) Either a boolean, in which case it controls whether
123s             we verify the server's TLS certificate, or a string, in which case it
123s             must be a path to a CA bundle to use
123s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
123s         :param proxies: (optional) The proxies dictionary to apply to the request.
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | 
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s GatewayClient.clear_instance() 123s > super().setup_class() 123s 123s notebook/tests/test_gateway.py:138: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _________ ERROR at setup of TestGateway.test_gateway_session_lifecycle _________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = 
True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. 
Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. 
Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 
123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 
123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | 
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s 
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s GatewayClient.clear_instance() 123s > super().setup_class() 123s 123s notebook/tests/test_gateway.py:138: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:198: in setup_class 123s cls.wait_until_alive() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s _________ ERROR at setup of NotebookAppTests.test_list_running_servers _________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 
123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 
123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = 
True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. 
Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. 
Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 
123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 
123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | 
None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 
123s             if redirect is not None:
123s                 redirect -= 1
123s             cause = "too many redirects"
123s             response_redirect_location = response.get_redirect_location()
123s             if response_redirect_location:
123s                 redirect_location = response_redirect_location
123s             status = response.status
123s 
123s         else:
123s             # Incrementing because of a server error like a 500 in
123s             # status_forcelist and the given method is in the allowed_methods
123s             cause = ResponseError.GENERIC_ERROR
123s             if response and response.status:
123s                 if status_count is not None:
123s                     status_count -= 1
123s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
123s                 status = response.status
123s 
123s         history = self.history + (
123s             RequestHistory(method, url, error, status, redirect_location),
123s         )
123s 
123s         new_retry = self.new(
123s             total=total,
123s             connect=connect,
123s             read=read,
123s             redirect=redirect,
123s             status=status_count,
123s             other=other,
123s             history=history,
123s         )
123s 
123s         if new_retry.is_exhausted():
123s             reason = error or ResponseError(cause)
123s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
123s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
123s 
123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
123s 
123s During handling of the above exception, another exception occurred:
123s 
123s cls = 
123s 
123s     @classmethod
123s     def wait_until_alive(cls):
123s         """Wait for the server to be alive"""
123s         url = cls.base_url() + 'api/contents'
123s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
123s             try:
123s >               cls.fetch_url(url)
123s 
123s notebook/tests/launchnotebook.py:53: 
123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
123s notebook/tests/launchnotebook.py:82: in fetch_url
123s     return requests.get(url)
123s 
/usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 
123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s ___________ ERROR at setup of NotebookAppTests.test_log_json_default ___________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 123s 123s `release_conn` will only behave as expected if 123s `preload_content=False` because we want to make 123s `preload_content=False` the default behaviour someday soon without 123s breaking backwards compatibility. 123s 123s :param method: 123s HTTP request method (such as GET, POST, PUT, etc.) 123s 123s :param url: 123s The URL to perform the request on. 123s 123s :param body: 123s Data to send in the request body, either :class:`str`, :class:`bytes`, 123s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 123s 123s :param headers: 123s Dictionary of custom headers to send, such as User-Agent, 123s If-None-Match, etc. If None, pool headers are used. If provided, 123s these headers completely replace any pool-specific headers. 123s 123s :param retries: 123s Configure the number of retries to allow before raising a 123s :class:`~urllib3.exceptions.MaxRetryError` exception. 123s 123s Pass ``None`` to retry until you receive a response. Pass a 123s :class:`~urllib3.util.retry.Retry` object for fine-grained control 123s over different types of retries. 123s Pass an integer number to retry connection errors that many times, 123s but no other types of errors. Pass zero to never retry. 123s 123s If ``False``, then retries are disabled and any exception is raised 123s immediately. Also, instead of raising a MaxRetryError on redirects, 123s the redirect response will be returned. 123s 123s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 123s 123s :param redirect: 123s If True, automatically handle redirects (status codes 301, 302, 123s 303, 307, 308). Each redirect counts as a retry. Disabling retries 123s will disable redirect, too. 
123s 123s :param assert_same_host: 123s If ``True``, will make sure that the host of the pool requests is 123s consistent else will raise HostChangedError. When ``False``, you can 123s use the pool on an HTTP proxy and request foreign hosts. 123s 123s :param timeout: 123s If specified, overrides the default timeout for this one 123s request. It may be a float (in seconds) or an instance of 123s :class:`urllib3.util.Timeout`. 123s 123s :param pool_timeout: 123s If set and the pool is set to block=True, then this method will 123s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 123s connection is available within the time period. 123s 123s :param bool preload_content: 123s If True, the response's body will be preloaded into memory. 123s 123s :param bool decode_content: 123s If True, will attempt to decode the body based on the 123s 'content-encoding' header. 123s 123s :param release_conn: 123s If False, then the urlopen call will not release the connection 123s back into the pool once a response is received (but will release if 123s you read the entire contents of the response such as when 123s `preload_content=True`). This is useful if you're not preloading 123s the response's content immediately. You will need to call 123s ``r.release_conn()`` on the response ``r`` to return the connection 123s back into the pool. If None, it takes the value of ``preload_content`` 123s which defaults to ``True``. 123s 123s :param bool chunked: 123s If True, urllib3 will send the body using chunked transfer 123s encoding. Otherwise, urllib3 will send the body using the standard 123s content-length form. Defaults to False. 123s 123s :param int body_pos: 123s Position to seek to in file-like body in the event of a retry or 123s redirect. Typically this won't need to be set because urllib3 will 123s auto-populate the value when needed. 
123s """ 123s parsed_url = parse_url(url) 123s destination_scheme = parsed_url.scheme 123s 123s if headers is None: 123s headers = self.headers 123s 123s if not isinstance(retries, Retry): 123s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 123s 123s if release_conn is None: 123s release_conn = preload_content 123s 123s # Check host 123s if assert_same_host and not self.is_same_host(url): 123s raise HostChangedError(self, url, retries) 123s 123s # Ensure that the URL we're connecting to is properly encoded 123s if url.startswith("/"): 123s url = to_str(_encode_target(url)) 123s else: 123s url = to_str(parsed_url.url) 123s 123s conn = None 123s 123s # Track whether `conn` needs to be released before 123s # returning/raising/recursing. Update this variable if necessary, and 123s # leave `release_conn` constant throughout the function. That way, if 123s # the function recurses, the original value of `release_conn` will be 123s # passed down into the recursive call, and its value will be respected. 123s # 123s # See issue #651 [1] for details. 123s # 123s # [1] 123s release_this_conn = release_conn 123s 123s http_tunnel_required = connection_requires_http_tunnel( 123s self.proxy, self.proxy_config, destination_scheme 123s ) 123s 123s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 123s # have to copy the headers dict so we can safely change it without those 123s # changes being reflected in anyone else's copy. 123s if not http_tunnel_required: 123s headers = headers.copy() # type: ignore[attr-defined] 123s headers.update(self.proxy_headers) # type: ignore[union-attr] 123s 123s # Must keep the exception bound to a separate variable or else Python 3 123s # complains about UnboundLocalError. 123s err = None 123s 123s # Keep track of whether we cleanly exited the except block. This 123s # ensures we do proper cleanup in finally. 123s clean_exit = False 123s 123s # Rewind body position, if needed. 
Record current position 123s # for future rewinds in the event of a redirect/retry. 123s body_pos = set_file_position(body, body_pos) 123s 123s try: 123s # Request a connection from the queue. 123s timeout_obj = self._get_timeout(timeout) 123s conn = self._get_conn(timeout=pool_timeout) 123s 123s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 123s 123s # Is this a closed/new connection that requires CONNECT tunnelling? 123s if self.proxy is not None and http_tunnel_required and conn.is_closed: 123s try: 123s self._prepare_proxy(conn) 123s except (BaseSSLError, OSError, SocketTimeout) as e: 123s self._raise_timeout( 123s err=e, url=self.proxy.url, timeout_value=conn.timeout 123s ) 123s raise 123s 123s # If we're going to release the connection in ``finally:``, then 123s # the response doesn't need to know about the connection. Otherwise 123s # it will also try to release it and we'll have a double-release 123s # mess. 123s response_conn = conn if not release_conn else None 123s 123s # Make the request on the HTTPConnection object 123s > response = self._make_request( 123s conn, 123s method, 123s url, 123s timeout=timeout_obj, 123s body=body, 123s headers=headers, 123s chunked=chunked, 123s retries=retries, 123s response_conn=response_conn, 123s preload_content=preload_content, 123s decode_content=decode_content, 123s **response_kw, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 123s conn.request( 123s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 123s self.endheaders() 123s /usr/lib/python3.12/http/client.py:1331: in endheaders 123s self._send_output(message_body, encode_chunked=encode_chunked) 123s /usr/lib/python3.12/http/client.py:1091: in _send_output 123s self.send(msg) 123s /usr/lib/python3.12/http/client.py:1035: in 
send 123s self.connect() 123s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 123s self.sock = self._new_conn() 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s except socket.gaierror as e: 123s raise NameResolutionError(self.host, self, e) from e 123s except SocketTimeout as e: 123s raise ConnectTimeoutError( 123s self, 123s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 123s ) from e 123s 123s except OSError as e: 123s > raise NewConnectionError( 123s self, f"Failed to establish a new connection: {e}" 123s ) from e 123s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 
123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s > resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:486: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 123s retries = retries.increment( 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s method = 'GET', url = '/a%40b/api/contents', response = None 123s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 123s _pool = 123s _stacktrace = 123s 123s def increment( 123s self, 123s method: str | None = None, 123s url: str | None = None, 123s response: BaseHTTPResponse | None = None, 123s error: Exception | None = None, 123s _pool: ConnectionPool | None = None, 123s _stacktrace: TracebackType | None = None, 123s ) -> Retry: 123s """Return a new Retry object with incremented retry counters. 123s 123s :param response: A response object, or None, if the server did not 123s return a response. 123s :type response: :class:`~urllib3.response.BaseHTTPResponse` 123s :param Exception error: An error encountered during the request, or 123s None if the response was received successfully. 123s 123s :return: A new ``Retry`` object. 123s """ 123s if self.total is False and error: 123s # Disabled, indicate to re-raise the error. 
123s raise reraise(type(error), error, _stacktrace) 123s 123s total = self.total 123s if total is not None: 123s total -= 1 123s 123s connect = self.connect 123s read = self.read 123s redirect = self.redirect 123s status_count = self.status 123s other = self.other 123s cause = "unknown" 123s status = None 123s redirect_location = None 123s 123s if error and self._is_connection_error(error): 123s # Connect retry? 123s if connect is False: 123s raise reraise(type(error), error, _stacktrace) 123s elif connect is not None: 123s connect -= 1 123s 123s elif error and self._is_read_error(error): 123s # Read retry? 123s if read is False or method is None or not self._is_method_retryable(method): 123s raise reraise(type(error), error, _stacktrace) 123s elif read is not None: 123s read -= 1 123s 123s elif error: 123s # Other retry? 123s if other is not None: 123s other -= 1 123s 123s elif response and response.get_redirect_location(): 123s # Redirect retry? 123s if redirect is not None: 123s redirect -= 1 123s cause = "too many redirects" 123s response_redirect_location = response.get_redirect_location() 123s if response_redirect_location: 123s redirect_location = response_redirect_location 123s status = response.status 123s 123s else: 123s # Incrementing because of a server error like a 500 in 123s # status_forcelist and the given method is in the allowed_methods 123s cause = ResponseError.GENERIC_ERROR 123s if response and response.status: 123s if status_count is not None: 123s status_count -= 1 123s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 123s status = response.status 123s 123s history = self.history + ( 123s RequestHistory(method, url, error, status, redirect_location), 123s ) 123s 123s new_retry = self.new( 123s total=total, 123s connect=connect, 123s read=read, 123s redirect=redirect, 123s status=status_count, 123s other=other, 123s history=history, 123s ) 123s 123s if new_retry.is_exhausted(): 123s reason = error or 
ResponseError(cause) 123s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 123s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 123s 123s During handling of the above exception, another exception occurred: 123s 123s cls = 123s 123s @classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s > cls.fetch_url(url) 123s 123s notebook/tests/launchnotebook.py:53: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s notebook/tests/launchnotebook.py:82: in fetch_url 123s return requests.get(url) 123s /usr/lib/python3/dist-packages/requests/api.py:73: in get 123s return request("get", url, params=params, **kwargs) 123s /usr/lib/python3/dist-packages/requests/api.py:59: in request 123s return session.request(method=method, url=url, **kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 123s resp = self.send(prep, **send_kwargs) 123s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 123s r = adapter.send(request, **kwargs) 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s self = 123s request = , stream = False 123s timeout = Timeout(connect=None, read=None, total=None), verify = True 123s cert = None, proxies = OrderedDict() 123s 123s def send( 123s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 123s ): 123s """Sends PreparedRequest object. Returns Response object. 123s 123s :param request: The :class:`PreparedRequest ` being sent. 
123s :param stream: (optional) Whether to stream the request content. 123s :param timeout: (optional) How long to wait for the server to send 123s data before giving up, as a float, or a :ref:`(connect timeout, 123s read timeout) ` tuple. 123s :type timeout: float or tuple or urllib3 Timeout object 123s :param verify: (optional) Either a boolean, in which case it controls whether 123s we verify the server's TLS certificate, or a string, in which case it 123s must be a path to a CA bundle to use 123s :param cert: (optional) Any user-provided SSL certificate to be trusted. 123s :param proxies: (optional) The proxies dictionary to apply to the request. 123s :rtype: requests.Response 123s """ 123s 123s try: 123s conn = self.get_connection(request.url, proxies) 123s except LocationValueError as e: 123s raise InvalidURL(e, request=request) 123s 123s self.cert_verify(conn, request.url, verify, cert) 123s url = self.request_url(request, proxies) 123s self.add_headers( 123s request, 123s stream=stream, 123s timeout=timeout, 123s verify=verify, 123s cert=cert, 123s proxies=proxies, 123s ) 123s 123s chunked = not (request.body is None or "Content-Length" in request.headers) 123s 123s if isinstance(timeout, tuple): 123s try: 123s connect, read = timeout 123s timeout = TimeoutSauce(connect=connect, read=read) 123s except ValueError: 123s raise ValueError( 123s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 123s f"or a single float to set both timeouts to the same value." 
123s ) 123s elif isinstance(timeout, TimeoutSauce): 123s pass 123s else: 123s timeout = TimeoutSauce(connect=timeout, read=timeout) 123s 123s try: 123s resp = conn.urlopen( 123s method=request.method, 123s url=url, 123s body=request.body, 123s headers=request.headers, 123s redirect=False, 123s assert_same_host=False, 123s preload_content=False, 123s decode_content=False, 123s retries=self.max_retries, 123s timeout=timeout, 123s chunked=chunked, 123s ) 123s 123s except (ProtocolError, OSError) as err: 123s raise ConnectionError(err, request=request) 123s 123s except MaxRetryError as e: 123s if isinstance(e.reason, ConnectTimeoutError): 123s # TODO: Remove this in 3.0.0: see #2811 123s if not isinstance(e.reason, NewConnectionError): 123s raise ConnectTimeout(e, request=request) 123s 123s if isinstance(e.reason, ResponseError): 123s raise RetryError(e, request=request) 123s 123s if isinstance(e.reason, _ProxyError): 123s raise ProxyError(e, request=request) 123s 123s if isinstance(e.reason, _SSLError): 123s # This branch is for urllib3 v1.22 and later. 
123s raise SSLError(e, request=request) 123s 123s > raise ConnectionError(e, request=request) 123s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 123s 123s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 123s 123s The above exception was the direct cause of the following exception: 123s 123s cls = 123s 123s @classmethod 123s def setup_class(cls): 123s cls.tmp_dir = TemporaryDirectory() 123s def tmp(*parts): 123s path = os.path.join(cls.tmp_dir.name, *parts) 123s try: 123s os.makedirs(path) 123s except OSError as e: 123s if e.errno != errno.EEXIST: 123s raise 123s return path 123s 123s cls.home_dir = tmp('home') 123s data_dir = cls.data_dir = tmp('data') 123s config_dir = cls.config_dir = tmp('config') 123s runtime_dir = cls.runtime_dir = tmp('runtime') 123s cls.notebook_dir = tmp('notebooks') 123s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 123s cls.env_patch.start() 123s # Patch systemwide & user-wide data & config directories, to isolate 123s # the tests from oddities of the local setup. But leave Python env 123s # locations alone, so data files for e.g. nbconvert are accessible. 123s # If this isolation isn't sufficient, you may need to run the tests in 123s # a virtualenv or conda env. 
123s cls.path_patch = patch.multiple( 123s jupyter_core.paths, 123s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 123s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 123s ) 123s cls.path_patch.start() 123s 123s config = cls.config or Config() 123s config.NotebookNotary.db_file = ':memory:' 123s 123s cls.token = hexlify(os.urandom(4)).decode('ascii') 123s 123s started = Event() 123s def start_thread(): 123s try: 123s bind_args = cls.get_bind_args() 123s app = cls.notebook = NotebookApp( 123s port_retries=0, 123s open_browser=False, 123s config_dir=cls.config_dir, 123s data_dir=cls.data_dir, 123s runtime_dir=cls.runtime_dir, 123s notebook_dir=cls.notebook_dir, 123s base_url=cls.url_prefix, 123s config=config, 123s allow_root=True, 123s token=cls.token, 123s **bind_args 123s ) 123s if "asyncio" in sys.modules: 123s app._init_asyncio_patch() 123s import asyncio 123s 123s asyncio.set_event_loop(asyncio.new_event_loop()) 123s # Patch the current loop in order to match production 123s # behavior 123s import nest_asyncio 123s 123s nest_asyncio.apply() 123s # don't register signal handler during tests 123s app.init_signal = lambda : None 123s # clear log handlers and propagate to root for nose to capture it 123s # needs to be redone after initialize, which reconfigures logging 123s app.log.propagate = True 123s app.log.handlers = [] 123s app.initialize(argv=cls.get_argv()) 123s app.log.propagate = True 123s app.log.handlers = [] 123s loop = IOLoop.current() 123s loop.add_callback(started.set) 123s app.start() 123s finally: 123s # set the event, so failure to start doesn't cause a hang 123s started.set() 123s app.session_manager.close() 123s cls.notebook_thread = Thread(target=start_thread) 123s cls.notebook_thread.daemon = True 123s cls.notebook_thread.start() 123s started.wait() 123s > cls.wait_until_alive() 123s 123s notebook/tests/launchnotebook.py:198: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s cls = 123s 123s 
@classmethod 123s def wait_until_alive(cls): 123s """Wait for the server to be alive""" 123s url = cls.base_url() + 'api/contents' 123s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 123s try: 123s cls.fetch_url(url) 123s except ModuleNotFoundError as error: 123s # Errors that should be immediately thrown back to caller 123s raise error 123s except Exception as e: 123s if not cls.notebook_thread.is_alive(): 123s > raise RuntimeError("The notebook server failed to start") from e 123s E RuntimeError: The notebook server failed to start 123s 123s notebook/tests/launchnotebook.py:59: RuntimeError 123s __________ ERROR at setup of NotebookAppTests.test_validate_log_json ___________ 123s 123s self = 123s 123s def _new_conn(self) -> socket.socket: 123s """Establish a socket connection and set nodelay settings on it. 123s 123s :return: New socket connection. 123s """ 123s try: 123s > sock = connection.create_connection( 123s (self._dns_host, self.port), 123s self.timeout, 123s source_address=self.source_address, 123s socket_options=self.socket_options, 123s ) 123s 123s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 123s raise err 123s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 123s 123s address = ('localhost', 12341), timeout = None, source_address = None 123s socket_options = [(6, 1, 1)] 123s 123s def create_connection( 123s address: tuple[str, int], 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s source_address: tuple[str, int] | None = None, 123s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 123s ) -> socket.socket: 123s """Connect to *address* and return the socket object. 123s 123s Convenience function. Connect to *address* (a 2-tuple ``(host, 123s port)``) and return the socket object. 
Passing the optional 123s *timeout* parameter will set the timeout on the socket instance 123s before attempting to connect. If no *timeout* is supplied, the 123s global default timeout setting returned by :func:`socket.getdefaulttimeout` 123s is used. If *source_address* is set it must be a tuple of (host, port) 123s for the socket to bind as a source address before making the connection. 123s An host of '' or port 0 tells the OS to use the default. 123s """ 123s 123s host, port = address 123s if host.startswith("["): 123s host = host.strip("[]") 123s err = None 123s 123s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 123s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 123s # The original create_connection function always returns all records. 123s family = allowed_gai_family() 123s 123s try: 123s host.encode("idna") 123s except UnicodeError: 123s raise LocationParseError(f"'{host}', label empty or too long") from None 123s 123s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 123s af, socktype, proto, canonname, sa = res 123s sock = None 123s try: 123s sock = socket.socket(af, socktype, proto) 123s 123s # If provided, set socket level options before connecting. 
123s _set_socket_options(sock, socket_options) 123s 123s if timeout is not _DEFAULT_TIMEOUT: 123s sock.settimeout(timeout) 123s if source_address: 123s sock.bind(source_address) 123s > sock.connect(sa) 123s E ConnectionRefusedError: [Errno 111] Connection refused 123s 123s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 123s 123s The above exception was the direct cause of the following exception: 123s 123s self = 123s method = 'GET', url = '/a%40b/api/contents', body = None 123s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 123s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 123s redirect = False, assert_same_host = False 123s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 123s release_conn = False, chunked = False, body_pos = None, preload_content = False 123s decode_content = False, response_kw = {} 123s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 123s destination_scheme = None, conn = None, release_this_conn = True 123s http_tunnel_required = False, err = None, clean_exit = False 123s 123s def urlopen( # type: ignore[override] 123s self, 123s method: str, 123s url: str, 123s body: _TYPE_BODY | None = None, 123s headers: typing.Mapping[str, str] | None = None, 123s retries: Retry | bool | int | None = None, 123s redirect: bool = True, 123s assert_same_host: bool = True, 123s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 123s pool_timeout: int | None = None, 123s release_conn: bool | None = None, 123s chunked: bool = False, 123s body_pos: _TYPE_BODY_POSITION | None = None, 123s preload_content: bool = True, 123s decode_content: bool = True, 123s **response_kw: typing.Any, 123s ) -> BaseHTTPResponse: 123s """ 123s Get a connection from the pool and perform an HTTP request. 
This is the 123s lowest level call for making a request, so you'll need to specify all 123s the raw details. 123s 123s .. note:: 123s 123s More commonly, it's appropriate to use a convenience method 123s such as :meth:`request`. 123s 123s .. note:: 124s 124s `release_conn` will only behave as expected if 124s `preload_content=False` because we want to make 124s `preload_content=False` the default behaviour someday soon without 124s breaking backwards compatibility. 124s 124s :param method: 124s HTTP request method (such as GET, POST, PUT, etc.) 124s 124s :param url: 124s The URL to perform the request on. 124s 124s :param body: 124s Data to send in the request body, either :class:`str`, :class:`bytes`, 124s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 124s 124s :param headers: 124s Dictionary of custom headers to send, such as User-Agent, 124s If-None-Match, etc. If None, pool headers are used. If provided, 124s these headers completely replace any pool-specific headers. 124s 124s :param retries: 124s Configure the number of retries to allow before raising a 124s :class:`~urllib3.exceptions.MaxRetryError` exception. 124s 124s Pass ``None`` to retry until you receive a response. Pass a 124s :class:`~urllib3.util.retry.Retry` object for fine-grained control 124s over different types of retries. 124s Pass an integer number to retry connection errors that many times, 124s but no other types of errors. Pass zero to never retry. 124s 124s If ``False``, then retries are disabled and any exception is raised 124s immediately. Also, instead of raising a MaxRetryError on redirects, 124s the redirect response will be returned. 124s 124s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 124s 124s :param redirect: 124s If True, automatically handle redirects (status codes 301, 302, 124s 303, 307, 308). Each redirect counts as a retry. Disabling retries 124s will disable redirect, too. 
124s 124s :param assert_same_host: 124s If ``True``, will make sure that the host of the pool requests is 124s consistent else will raise HostChangedError. When ``False``, you can 124s use the pool on an HTTP proxy and request foreign hosts. 124s 124s :param timeout: 124s If specified, overrides the default timeout for this one 124s request. It may be a float (in seconds) or an instance of 124s :class:`urllib3.util.Timeout`. 124s 124s :param pool_timeout: 124s If set and the pool is set to block=True, then this method will 124s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 124s connection is available within the time period. 124s 124s :param bool preload_content: 124s If True, the response's body will be preloaded into memory. 124s 124s :param bool decode_content: 124s If True, will attempt to decode the body based on the 124s 'content-encoding' header. 124s 124s :param release_conn: 124s If False, then the urlopen call will not release the connection 124s back into the pool once a response is received (but will release if 124s you read the entire contents of the response such as when 124s `preload_content=True`). This is useful if you're not preloading 124s the response's content immediately. You will need to call 124s ``r.release_conn()`` on the response ``r`` to return the connection 124s back into the pool. If None, it takes the value of ``preload_content`` 124s which defaults to ``True``. 124s 124s :param bool chunked: 124s If True, urllib3 will send the body using chunked transfer 124s encoding. Otherwise, urllib3 will send the body using the standard 124s content-length form. Defaults to False. 124s 124s :param int body_pos: 124s Position to seek to in file-like body in the event of a retry or 124s redirect. Typically this won't need to be set because urllib3 will 124s auto-populate the value when needed. 
124s """ 124s parsed_url = parse_url(url) 124s destination_scheme = parsed_url.scheme 124s 124s if headers is None: 124s headers = self.headers 124s 124s if not isinstance(retries, Retry): 124s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 124s 124s if release_conn is None: 124s release_conn = preload_content 124s 124s # Check host 124s if assert_same_host and not self.is_same_host(url): 124s raise HostChangedError(self, url, retries) 124s 124s # Ensure that the URL we're connecting to is properly encoded 124s if url.startswith("/"): 124s url = to_str(_encode_target(url)) 124s else: 124s url = to_str(parsed_url.url) 124s 124s conn = None 124s 124s # Track whether `conn` needs to be released before 124s # returning/raising/recursing. Update this variable if necessary, and 124s # leave `release_conn` constant throughout the function. That way, if 124s # the function recurses, the original value of `release_conn` will be 124s # passed down into the recursive call, and its value will be respected. 124s # 124s # See issue #651 [1] for details. 124s # 124s # [1] 124s release_this_conn = release_conn 124s 124s http_tunnel_required = connection_requires_http_tunnel( 124s self.proxy, self.proxy_config, destination_scheme 124s ) 124s 124s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 124s # have to copy the headers dict so we can safely change it without those 124s # changes being reflected in anyone else's copy. 124s if not http_tunnel_required: 124s headers = headers.copy() # type: ignore[attr-defined] 124s headers.update(self.proxy_headers) # type: ignore[union-attr] 124s 124s # Must keep the exception bound to a separate variable or else Python 3 124s # complains about UnboundLocalError. 124s err = None 124s 124s # Keep track of whether we cleanly exited the except block. This 124s # ensures we do proper cleanup in finally. 124s clean_exit = False 124s 124s # Rewind body position, if needed. 
Record current position 124s # for future rewinds in the event of a redirect/retry. 124s body_pos = set_file_position(body, body_pos) 124s 124s try: 124s # Request a connection from the queue. 124s timeout_obj = self._get_timeout(timeout) 124s conn = self._get_conn(timeout=pool_timeout) 124s 124s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 124s 124s # Is this a closed/new connection that requires CONNECT tunnelling? 124s if self.proxy is not None and http_tunnel_required and conn.is_closed: 124s try: 124s self._prepare_proxy(conn) 124s except (BaseSSLError, OSError, SocketTimeout) as e: 124s self._raise_timeout( 124s err=e, url=self.proxy.url, timeout_value=conn.timeout 124s ) 124s raise 124s 124s # If we're going to release the connection in ``finally:``, then 124s # the response doesn't need to know about the connection. Otherwise 124s # it will also try to release it and we'll have a double-release 124s # mess. 124s response_conn = conn if not release_conn else None 124s 124s # Make the request on the HTTPConnection object 124s > response = self._make_request( 124s conn, 124s method, 124s url, 124s timeout=timeout_obj, 124s body=body, 124s headers=headers, 124s chunked=chunked, 124s retries=retries, 124s response_conn=response_conn, 124s preload_content=preload_content, 124s decode_content=decode_content, 124s **response_kw, 124s ) 124s 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 124s conn.request( 124s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 124s self.endheaders() 124s /usr/lib/python3.12/http/client.py:1331: in endheaders 124s self._send_output(message_body, encode_chunked=encode_chunked) 124s /usr/lib/python3.12/http/client.py:1091: in _send_output 124s self.send(msg) 124s /usr/lib/python3.12/http/client.py:1035: in 
send 124s self.connect() 124s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 124s self.sock = self._new_conn() 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s 124s def _new_conn(self) -> socket.socket: 124s """Establish a socket connection and set nodelay settings on it. 124s 124s :return: New socket connection. 124s """ 124s try: 124s sock = connection.create_connection( 124s (self._dns_host, self.port), 124s self.timeout, 124s source_address=self.source_address, 124s socket_options=self.socket_options, 124s ) 124s except socket.gaierror as e: 124s raise NameResolutionError(self.host, self, e) from e 124s except SocketTimeout as e: 124s raise ConnectTimeoutError( 124s self, 124s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 124s ) from e 124s 124s except OSError as e: 124s > raise NewConnectionError( 124s self, f"Failed to establish a new connection: {e}" 124s ) from e 124s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 124s 124s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 124s 124s The above exception was the direct cause of the following exception: 124s 124s self = 124s request = , stream = False 124s timeout = Timeout(connect=None, read=None, total=None), verify = True 124s cert = None, proxies = OrderedDict() 124s 124s def send( 124s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 124s ): 124s """Sends PreparedRequest object. Returns Response object. 124s 124s :param request: The :class:`PreparedRequest ` being sent. 124s :param stream: (optional) Whether to stream the request content. 124s :param timeout: (optional) How long to wait for the server to send 124s data before giving up, as a float, or a :ref:`(connect timeout, 124s read timeout) ` tuple. 
124s :type timeout: float or tuple or urllib3 Timeout object 124s :param verify: (optional) Either a boolean, in which case it controls whether 124s we verify the server's TLS certificate, or a string, in which case it 124s must be a path to a CA bundle to use 124s :param cert: (optional) Any user-provided SSL certificate to be trusted. 124s :param proxies: (optional) The proxies dictionary to apply to the request. 124s :rtype: requests.Response 124s """ 124s 124s try: 124s conn = self.get_connection(request.url, proxies) 124s except LocationValueError as e: 124s raise InvalidURL(e, request=request) 124s 124s self.cert_verify(conn, request.url, verify, cert) 124s url = self.request_url(request, proxies) 124s self.add_headers( 124s request, 124s stream=stream, 124s timeout=timeout, 124s verify=verify, 124s cert=cert, 124s proxies=proxies, 124s ) 124s 124s chunked = not (request.body is None or "Content-Length" in request.headers) 124s 124s if isinstance(timeout, tuple): 124s try: 124s connect, read = timeout 124s timeout = TimeoutSauce(connect=connect, read=read) 124s except ValueError: 124s raise ValueError( 124s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 124s f"or a single float to set both timeouts to the same value." 
124s ) 124s elif isinstance(timeout, TimeoutSauce): 124s pass 124s else: 124s timeout = TimeoutSauce(connect=timeout, read=timeout) 124s 124s try: 124s > resp = conn.urlopen( 124s method=request.method, 124s url=url, 124s body=request.body, 124s headers=request.headers, 124s redirect=False, 124s assert_same_host=False, 124s preload_content=False, 124s decode_content=False, 124s retries=self.max_retries, 124s timeout=timeout, 124s chunked=chunked, 124s ) 124s 124s /usr/lib/python3/dist-packages/requests/adapters.py:486: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 124s retries = retries.increment( 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 124s method = 'GET', url = '/a%40b/api/contents', response = None 124s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 124s _pool = 124s _stacktrace = 124s 124s def increment( 124s self, 124s method: str | None = None, 124s url: str | None = None, 124s response: BaseHTTPResponse | None = None, 124s error: Exception | None = None, 124s _pool: ConnectionPool | None = None, 124s _stacktrace: TracebackType | None = None, 124s ) -> Retry: 124s """Return a new Retry object with incremented retry counters. 124s 124s :param response: A response object, or None, if the server did not 124s return a response. 124s :type response: :class:`~urllib3.response.BaseHTTPResponse` 124s :param Exception error: An error encountered during the request, or 124s None if the response was received successfully. 124s 124s :return: A new ``Retry`` object. 124s """ 124s if self.total is False and error: 124s # Disabled, indicate to re-raise the error. 
124s raise reraise(type(error), error, _stacktrace) 124s 124s total = self.total 124s if total is not None: 124s total -= 1 124s 124s connect = self.connect 124s read = self.read 124s redirect = self.redirect 124s status_count = self.status 124s other = self.other 124s cause = "unknown" 124s status = None 124s redirect_location = None 124s 124s if error and self._is_connection_error(error): 124s # Connect retry? 124s if connect is False: 124s raise reraise(type(error), error, _stacktrace) 124s elif connect is not None: 124s connect -= 1 124s 124s elif error and self._is_read_error(error): 124s # Read retry? 124s if read is False or method is None or not self._is_method_retryable(method): 124s raise reraise(type(error), error, _stacktrace) 124s elif read is not None: 124s read -= 1 124s 124s elif error: 124s # Other retry? 124s if other is not None: 124s other -= 1 124s 124s elif response and response.get_redirect_location(): 124s # Redirect retry? 124s if redirect is not None: 124s redirect -= 1 124s cause = "too many redirects" 124s response_redirect_location = response.get_redirect_location() 124s if response_redirect_location: 124s redirect_location = response_redirect_location 124s status = response.status 124s 124s else: 124s # Incrementing because of a server error like a 500 in 124s # status_forcelist and the given method is in the allowed_methods 124s cause = ResponseError.GENERIC_ERROR 124s if response and response.status: 124s if status_count is not None: 124s status_count -= 1 124s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 124s status = response.status 124s 124s history = self.history + ( 124s RequestHistory(method, url, error, status, redirect_location), 124s ) 124s 124s new_retry = self.new( 124s total=total, 124s connect=connect, 124s read=read, 124s redirect=redirect, 124s status=status_count, 124s other=other, 124s history=history, 124s ) 124s 124s if new_retry.is_exhausted(): 124s reason = error or 
ResponseError(cause) 124s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 124s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 124s 124s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 124s 124s During handling of the above exception, another exception occurred: 124s 124s cls = 124s 124s @classmethod 124s def wait_until_alive(cls): 124s """Wait for the server to be alive""" 124s url = cls.base_url() + 'api/contents' 124s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 124s try: 124s > cls.fetch_url(url) 124s 124s notebook/tests/launchnotebook.py:53: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s notebook/tests/launchnotebook.py:82: in fetch_url 124s return requests.get(url) 124s /usr/lib/python3/dist-packages/requests/api.py:73: in get 124s return request("get", url, params=params, **kwargs) 124s /usr/lib/python3/dist-packages/requests/api.py:59: in request 124s return session.request(method=method, url=url, **kwargs) 124s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 124s resp = self.send(prep, **send_kwargs) 124s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 124s r = adapter.send(request, **kwargs) 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s request = , stream = False 124s timeout = Timeout(connect=None, read=None, total=None), verify = True 124s cert = None, proxies = OrderedDict() 124s 124s def send( 124s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 124s ): 124s """Sends PreparedRequest object. Returns Response object. 124s 124s :param request: The :class:`PreparedRequest ` being sent. 
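`Retry.increment`, shown above with `total=0`, spends one unit of the retry budget per failure and raises `MaxRetryError` once the budget is exhausted — which is why a single refused connection surfaces immediately instead of being retried. A toy model of just the `total` counter (class and error names are mine; real urllib3 also tracks `connect`/`read`/`redirect`/`status` budgets separately):

```python
class ToyRetry:
    """Simplified stand-in for urllib3's Retry: only the total budget."""

    def __init__(self, total):
        self.total = total  # None means "retry forever"

    def increment(self, error):
        total = None if self.total is None else self.total - 1
        if total is not None and total < 0:
            # Budget exhausted: raise a terminal error, like MaxRetryError.
            raise RuntimeError(f"Max retries exceeded (caused by {error!r})") from error
        return ToyRetry(total)

r = ToyRetry(total=0)
try:
    r.increment(ConnectionRefusedError(111, "Connection refused"))
except RuntimeError as e:
    print(e)
```

With `total=0` (the value the test harness uses), the very first error drives the counter below zero and the original exception is chained into the terminal one, matching the `Caused by NewConnectionError(...)` text in the log.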
124s :param stream: (optional) Whether to stream the request content. 124s :param timeout: (optional) How long to wait for the server to send 124s data before giving up, as a float, or a :ref:`(connect timeout, 124s read timeout) ` tuple. 124s :type timeout: float or tuple or urllib3 Timeout object 124s :param verify: (optional) Either a boolean, in which case it controls whether 124s we verify the server's TLS certificate, or a string, in which case it 124s must be a path to a CA bundle to use 124s :param cert: (optional) Any user-provided SSL certificate to be trusted. 124s :param proxies: (optional) The proxies dictionary to apply to the request. 124s :rtype: requests.Response 124s """ 124s 124s try: 124s conn = self.get_connection(request.url, proxies) 124s except LocationValueError as e: 124s raise InvalidURL(e, request=request) 124s 124s self.cert_verify(conn, request.url, verify, cert) 124s url = self.request_url(request, proxies) 124s self.add_headers( 124s request, 124s stream=stream, 124s timeout=timeout, 124s verify=verify, 124s cert=cert, 124s proxies=proxies, 124s ) 124s 124s chunked = not (request.body is None or "Content-Length" in request.headers) 124s 124s if isinstance(timeout, tuple): 124s try: 124s connect, read = timeout 124s timeout = TimeoutSauce(connect=connect, read=read) 124s except ValueError: 124s raise ValueError( 124s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 124s f"or a single float to set both timeouts to the same value." 
124s ) 124s elif isinstance(timeout, TimeoutSauce): 124s pass 124s else: 124s timeout = TimeoutSauce(connect=timeout, read=timeout) 124s 124s try: 124s resp = conn.urlopen( 124s method=request.method, 124s url=url, 124s body=request.body, 124s headers=request.headers, 124s redirect=False, 124s assert_same_host=False, 124s preload_content=False, 124s decode_content=False, 124s retries=self.max_retries, 124s timeout=timeout, 124s chunked=chunked, 124s ) 124s 124s except (ProtocolError, OSError) as err: 124s raise ConnectionError(err, request=request) 124s 124s except MaxRetryError as e: 124s if isinstance(e.reason, ConnectTimeoutError): 124s # TODO: Remove this in 3.0.0: see #2811 124s if not isinstance(e.reason, NewConnectionError): 124s raise ConnectTimeout(e, request=request) 124s 124s if isinstance(e.reason, ResponseError): 124s raise RetryError(e, request=request) 124s 124s if isinstance(e.reason, _ProxyError): 124s raise ProxyError(e, request=request) 124s 124s if isinstance(e.reason, _SSLError): 124s # This branch is for urllib3 v1.22 and later. 
124s raise SSLError(e, request=request) 124s 124s > raise ConnectionError(e, request=request) 124s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 124s 124s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 124s 124s The above exception was the direct cause of the following exception: 124s 124s cls = 124s 124s @classmethod 124s def setup_class(cls): 124s cls.tmp_dir = TemporaryDirectory() 124s def tmp(*parts): 124s path = os.path.join(cls.tmp_dir.name, *parts) 124s try: 124s os.makedirs(path) 124s except OSError as e: 124s if e.errno != errno.EEXIST: 124s raise 124s return path 124s 124s cls.home_dir = tmp('home') 124s data_dir = cls.data_dir = tmp('data') 124s config_dir = cls.config_dir = tmp('config') 124s runtime_dir = cls.runtime_dir = tmp('runtime') 124s cls.notebook_dir = tmp('notebooks') 124s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 124s cls.env_patch.start() 124s # Patch systemwide & user-wide data & config directories, to isolate 124s # the tests from oddities of the local setup. But leave Python env 124s # locations alone, so data files for e.g. nbconvert are accessible. 124s # If this isolation isn't sufficient, you may need to run the tests in 124s # a virtualenv or conda env. 
124s cls.path_patch = patch.multiple( 124s jupyter_core.paths, 124s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 124s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 124s ) 124s cls.path_patch.start() 124s 124s config = cls.config or Config() 124s config.NotebookNotary.db_file = ':memory:' 124s 124s cls.token = hexlify(os.urandom(4)).decode('ascii') 124s 124s started = Event() 124s def start_thread(): 124s try: 124s bind_args = cls.get_bind_args() 124s app = cls.notebook = NotebookApp( 124s port_retries=0, 124s open_browser=False, 124s config_dir=cls.config_dir, 124s data_dir=cls.data_dir, 124s runtime_dir=cls.runtime_dir, 124s notebook_dir=cls.notebook_dir, 124s base_url=cls.url_prefix, 124s config=config, 124s allow_root=True, 124s token=cls.token, 124s **bind_args 124s ) 124s if "asyncio" in sys.modules: 124s app._init_asyncio_patch() 124s import asyncio 124s 124s asyncio.set_event_loop(asyncio.new_event_loop()) 124s # Patch the current loop in order to match production 124s # behavior 124s import nest_asyncio 124s 124s nest_asyncio.apply() 124s # don't register signal handler during tests 124s app.init_signal = lambda : None 124s # clear log handlers and propagate to root for nose to capture it 124s # needs to be redone after initialize, which reconfigures logging 124s app.log.propagate = True 124s app.log.handlers = [] 124s app.initialize(argv=cls.get_argv()) 124s app.log.propagate = True 124s app.log.handlers = [] 124s loop = IOLoop.current() 124s loop.add_callback(started.set) 124s app.start() 124s finally: 124s # set the event, so failure to start doesn't cause a hang 124s started.set() 124s app.session_manager.close() 124s cls.notebook_thread = Thread(target=start_thread) 124s cls.notebook_thread.daemon = True 124s cls.notebook_thread.start() 124s started.wait() 124s > cls.wait_until_alive() 124s 124s notebook/tests/launchnotebook.py:198: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s cls = 124s 124s 
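`setup_class` above starts the notebook server in a daemon thread and then polls it through `wait_until_alive`. The polling pattern itself, independent of Jupyter, can be sketched as follows — `probe`, `is_server_alive`, and the parameter names are mine, standing in for `fetch_url`, the thread-liveness check, and `MAX_WAITTIME`/`POLL_INTERVAL`:

```python
import time

def wait_until_alive(probe, is_server_alive, max_wait=5.0, poll_interval=0.01):
    """Call `probe` until it succeeds, the server dies, or time runs out."""
    for _ in range(int(max_wait / poll_interval)):
        try:
            return probe()
        except Exception as e:
            if not is_server_alive():
                # Same shape as the failure in the log: the server never came up.
                raise RuntimeError("The notebook server failed to start") from e
            time.sleep(poll_interval)
    raise TimeoutError("server did not respond within max_wait")

# Simulated server that answers on the third poll.
attempts = {"n": 0}
def probe():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionRefusedError(111, "Connection refused")
    return "ok"

print(wait_until_alive(probe, is_server_alive=lambda: True))
```

The key design point, visible in the traceback: connection errors are tolerated only while the server thread is still alive; as soon as the thread exits, the last error is chained into `RuntimeError("The notebook server failed to start")` — which is exactly how this test run ended.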
@classmethod 124s def wait_until_alive(cls): 124s """Wait for the server to be alive""" 124s url = cls.base_url() + 'api/contents' 124s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 124s try: 124s cls.fetch_url(url) 124s except ModuleNotFoundError as error: 124s # Errors that should be immediately thrown back to caller 124s raise error 124s except Exception as e: 124s if not cls.notebook_thread.is_alive(): 124s > raise RuntimeError("The notebook server failed to start") from e 124s E RuntimeError: The notebook server failed to start 124s 124s notebook/tests/launchnotebook.py:59: RuntimeError 124s ___ ERROR at setup of NotebookUnixSocketTests.test_list_running_sock_servers ___ 124s 124s self = 124s method = 'GET', url = '/a%40b/api/contents', body = None 124s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 124s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 124s redirect = False, assert_same_host = False 124s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 124s release_conn = False, chunked = False, body_pos = None, preload_content = False 124s decode_content = False, response_kw = {} 124s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 124s destination_scheme = None, conn = None, release_this_conn = True 124s http_tunnel_required = False, err = None, clean_exit = False 124s 124s def urlopen( # type: ignore[override] 124s self, 124s method: str, 124s url: str, 124s body: _TYPE_BODY | None = None, 124s headers: typing.Mapping[str, str] | None = None, 124s retries: Retry | bool | int | None = None, 124s redirect: bool = True, 124s assert_same_host: bool = True, 124s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 124s pool_timeout: int | None = None, 124s release_conn: bool | None = None, 124s chunked: bool = False, 124s body_pos: _TYPE_BODY_POSITION | None = 
None, 124s preload_content: bool = True, 124s decode_content: bool = True, 124s **response_kw: typing.Any, 124s ) -> BaseHTTPResponse: 124s """ 124s Get a connection from the pool and perform an HTTP request. This is the 124s lowest level call for making a request, so you'll need to specify all 124s the raw details. 124s 124s .. note:: 124s 124s More commonly, it's appropriate to use a convenience method 124s such as :meth:`request`. 124s 124s .. note:: 124s 124s `release_conn` will only behave as expected if 124s `preload_content=False` because we want to make 124s `preload_content=False` the default behaviour someday soon without 124s breaking backwards compatibility. 124s 124s :param method: 124s HTTP request method (such as GET, POST, PUT, etc.) 124s 124s :param url: 124s The URL to perform the request on. 124s 124s :param body: 124s Data to send in the request body, either :class:`str`, :class:`bytes`, 124s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 124s 124s :param headers: 124s Dictionary of custom headers to send, such as User-Agent, 124s If-None-Match, etc. If None, pool headers are used. If provided, 124s these headers completely replace any pool-specific headers. 124s 124s :param retries: 124s Configure the number of retries to allow before raising a 124s :class:`~urllib3.exceptions.MaxRetryError` exception. 124s 124s Pass ``None`` to retry until you receive a response. Pass a 124s :class:`~urllib3.util.retry.Retry` object for fine-grained control 124s over different types of retries. 124s Pass an integer number to retry connection errors that many times, 124s but no other types of errors. Pass zero to never retry. 124s 124s If ``False``, then retries are disabled and any exception is raised 124s immediately. Also, instead of raising a MaxRetryError on redirects, 124s the redirect response will be returned. 124s 124s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
124s 124s :param redirect: 124s If True, automatically handle redirects (status codes 301, 302, 124s 303, 307, 308). Each redirect counts as a retry. Disabling retries 124s will disable redirect, too. 124s 124s :param assert_same_host: 124s If ``True``, will make sure that the host of the pool requests is 124s consistent else will raise HostChangedError. When ``False``, you can 124s use the pool on an HTTP proxy and request foreign hosts. 124s 124s :param timeout: 124s If specified, overrides the default timeout for this one 124s request. It may be a float (in seconds) or an instance of 124s :class:`urllib3.util.Timeout`. 124s 124s :param pool_timeout: 124s If set and the pool is set to block=True, then this method will 124s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 124s connection is available within the time period. 124s 124s :param bool preload_content: 124s If True, the response's body will be preloaded into memory. 124s 124s :param bool decode_content: 124s If True, will attempt to decode the body based on the 124s 'content-encoding' header. 124s 124s :param release_conn: 124s If False, then the urlopen call will not release the connection 124s back into the pool once a response is received (but will release if 124s you read the entire contents of the response such as when 124s `preload_content=True`). This is useful if you're not preloading 124s the response's content immediately. You will need to call 124s ``r.release_conn()`` on the response ``r`` to return the connection 124s back into the pool. If None, it takes the value of ``preload_content`` 124s which defaults to ``True``. 124s 124s :param bool chunked: 124s If True, urllib3 will send the body using chunked transfer 124s encoding. Otherwise, urllib3 will send the body using the standard 124s content-length form. Defaults to False. 124s 124s :param int body_pos: 124s Position to seek to in file-like body in the event of a retry or 124s redirect. 
Typically this won't need to be set because urllib3 will 124s auto-populate the value when needed. 124s """ 124s parsed_url = parse_url(url) 124s destination_scheme = parsed_url.scheme 124s 124s if headers is None: 124s headers = self.headers 124s 124s if not isinstance(retries, Retry): 124s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 124s 124s if release_conn is None: 124s release_conn = preload_content 124s 124s # Check host 124s if assert_same_host and not self.is_same_host(url): 124s raise HostChangedError(self, url, retries) 124s 124s # Ensure that the URL we're connecting to is properly encoded 124s if url.startswith("/"): 124s url = to_str(_encode_target(url)) 124s else: 124s url = to_str(parsed_url.url) 124s 124s conn = None 124s 124s # Track whether `conn` needs to be released before 124s # returning/raising/recursing. Update this variable if necessary, and 124s # leave `release_conn` constant throughout the function. That way, if 124s # the function recurses, the original value of `release_conn` will be 124s # passed down into the recursive call, and its value will be respected. 124s # 124s # See issue #651 [1] for details. 124s # 124s # [1] 124s release_this_conn = release_conn 124s 124s http_tunnel_required = connection_requires_http_tunnel( 124s self.proxy, self.proxy_config, destination_scheme 124s ) 124s 124s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 124s # have to copy the headers dict so we can safely change it without those 124s # changes being reflected in anyone else's copy. 124s if not http_tunnel_required: 124s headers = headers.copy() # type: ignore[attr-defined] 124s headers.update(self.proxy_headers) # type: ignore[union-attr] 124s 124s # Must keep the exception bound to a separate variable or else Python 3 124s # complains about UnboundLocalError. 124s err = None 124s 124s # Keep track of whether we cleanly exited the except block. 
This 124s # ensures we do proper cleanup in finally. 124s clean_exit = False 124s 124s # Rewind body position, if needed. Record current position 124s # for future rewinds in the event of a redirect/retry. 124s body_pos = set_file_position(body, body_pos) 124s 124s try: 124s # Request a connection from the queue. 124s timeout_obj = self._get_timeout(timeout) 124s conn = self._get_conn(timeout=pool_timeout) 124s 124s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 124s 124s # Is this a closed/new connection that requires CONNECT tunnelling? 124s if self.proxy is not None and http_tunnel_required and conn.is_closed: 124s try: 124s self._prepare_proxy(conn) 124s except (BaseSSLError, OSError, SocketTimeout) as e: 124s self._raise_timeout( 124s err=e, url=self.proxy.url, timeout_value=conn.timeout 124s ) 124s raise 124s 124s # If we're going to release the connection in ``finally:``, then 124s # the response doesn't need to know about the connection. Otherwise 124s # it will also try to release it and we'll have a double-release 124s # mess. 
124s response_conn = conn if not release_conn else None 124s 124s # Make the request on the HTTPConnection object 124s > response = self._make_request( 124s conn, 124s method, 124s url, 124s timeout=timeout_obj, 124s body=body, 124s headers=headers, 124s chunked=chunked, 124s retries=retries, 124s response_conn=response_conn, 124s preload_content=preload_content, 124s decode_content=decode_content, 124s **response_kw, 124s ) 124s 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 124s conn.request( 124s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 124s self.endheaders() 124s /usr/lib/python3.12/http/client.py:1331: in endheaders 124s self._send_output(message_body, encode_chunked=encode_chunked) 124s /usr/lib/python3.12/http/client.py:1091: in _send_output 124s self.send(msg) 124s /usr/lib/python3.12/http/client.py:1035: in send 124s self.connect() 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s 124s def connect(self): 124s sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) 124s sock.settimeout(self.timeout) 124s socket_path = unquote(urlparse(self.unix_socket_url).netloc) 124s > sock.connect(socket_path) 124s E FileNotFoundError: [Errno 2] No such file or directory 124s 124s /usr/lib/python3/dist-packages/requests_unixsocket/adapters.py:36: FileNotFoundError 124s 124s During handling of the above exception, another exception occurred: 124s 124s self = 124s request = , stream = False 124s timeout = Timeout(connect=None, read=None, total=None), verify = True 124s cert = None 124s proxies = OrderedDict({'no': '127.0.0.1,127.0.1.1,login.ubuntu.com,localhost,localdomain,novalocal,internal,archive.ubuntu.com,p...,objectstorage.prodstack5.canonical.com', 'https': 'http://squid.internal:3128', 'http': 
'http://squid.internal:3128'}) 124s 124s def send( 124s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 124s ): 124s """Sends PreparedRequest object. Returns Response object. 124s 124s :param request: The :class:`PreparedRequest ` being sent. 124s :param stream: (optional) Whether to stream the request content. 124s :param timeout: (optional) How long to wait for the server to send 124s data before giving up, as a float, or a :ref:`(connect timeout, 124s read timeout) ` tuple. 124s :type timeout: float or tuple or urllib3 Timeout object 124s :param verify: (optional) Either a boolean, in which case it controls whether 124s we verify the server's TLS certificate, or a string, in which case it 124s must be a path to a CA bundle to use 124s :param cert: (optional) Any user-provided SSL certificate to be trusted. 124s :param proxies: (optional) The proxies dictionary to apply to the request. 124s :rtype: requests.Response 124s """ 124s 124s try: 124s conn = self.get_connection(request.url, proxies) 124s except LocationValueError as e: 124s raise InvalidURL(e, request=request) 124s 124s self.cert_verify(conn, request.url, verify, cert) 124s url = self.request_url(request, proxies) 124s self.add_headers( 124s request, 124s stream=stream, 124s timeout=timeout, 124s verify=verify, 124s cert=cert, 124s proxies=proxies, 124s ) 124s 124s chunked = not (request.body is None or "Content-Length" in request.headers) 124s 124s if isinstance(timeout, tuple): 124s try: 124s connect, read = timeout 124s timeout = TimeoutSauce(connect=connect, read=read) 124s except ValueError: 124s raise ValueError( 124s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 124s f"or a single float to set both timeouts to the same value." 
124s ) 124s elif isinstance(timeout, TimeoutSauce): 124s pass 124s else: 124s timeout = TimeoutSauce(connect=timeout, read=timeout) 124s 124s try: 124s > resp = conn.urlopen( 124s method=request.method, 124s url=url, 124s body=request.body, 124s headers=request.headers, 124s redirect=False, 124s assert_same_host=False, 124s preload_content=False, 124s decode_content=False, 124s retries=self.max_retries, 124s timeout=timeout, 124s chunked=chunked, 124s ) 124s 124s /usr/lib/python3/dist-packages/requests/adapters.py:486: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 124s retries = retries.increment( 124s /usr/lib/python3/dist-packages/urllib3/util/retry.py:470: in increment 124s raise reraise(type(error), error, _stacktrace) 124s /usr/lib/python3/dist-packages/urllib3/util/util.py:38: in reraise 124s raise value.with_traceback(tb) 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: in urlopen 124s response = self._make_request( 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 124s conn.request( 124s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 124s self.endheaders() 124s /usr/lib/python3.12/http/client.py:1331: in endheaders 124s self._send_output(message_body, encode_chunked=encode_chunked) 124s /usr/lib/python3.12/http/client.py:1091: in _send_output 124s self.send(msg) 124s /usr/lib/python3.12/http/client.py:1035: in send 124s self.connect() 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s 124s def connect(self): 124s sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) 124s sock.settimeout(self.timeout) 124s socket_path = unquote(urlparse(self.unix_socket_url).netloc) 124s > sock.connect(socket_path) 124s E urllib3.exceptions.ProtocolError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory')) 
124s 124s /usr/lib/python3/dist-packages/requests_unixsocket/adapters.py:36: ProtocolError 124s 124s During handling of the above exception, another exception occurred: 124s 124s cls = 124s 124s @classmethod 124s def wait_until_alive(cls): 124s """Wait for the server to be alive""" 124s url = cls.base_url() + 'api/contents' 124s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 124s try: 124s > cls.fetch_url(url) 124s 124s notebook/tests/launchnotebook.py:53: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s notebook/tests/launchnotebook.py:242: in fetch_url 124s return requests.get(url) 124s /usr/lib/python3/dist-packages/requests_unixsocket/__init__.py:51: in get 124s return request('get', url, **kwargs) 124s /usr/lib/python3/dist-packages/requests_unixsocket/__init__.py:46: in request 124s return session.request(method=method, url=url, **kwargs) 124s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 124s resp = self.send(prep, **send_kwargs) 124s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 124s r = adapter.send(request, **kwargs) 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s request = , stream = False 124s timeout = Timeout(connect=None, read=None, total=None), verify = True 124s cert = None 124s proxies = OrderedDict({'no': '127.0.0.1,127.0.1.1,login.ubuntu.com,localhost,localdomain,novalocal,internal,archive.ubuntu.com,p...,objectstorage.prodstack5.canonical.com', 'https': 'http://squid.internal:3128', 'http': 'http://squid.internal:3128'}) 124s 124s def send( 124s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 124s ): 124s """Sends PreparedRequest object. Returns Response object. 124s 124s :param request: The :class:`PreparedRequest ` being sent. 124s :param stream: (optional) Whether to stream the request content. 
124s :param timeout: (optional) How long to wait for the server to send 124s data before giving up, as a float, or a :ref:`(connect timeout, 124s read timeout) ` tuple. 124s :type timeout: float or tuple or urllib3 Timeout object 124s :param verify: (optional) Either a boolean, in which case it controls whether 124s we verify the server's TLS certificate, or a string, in which case it 124s must be a path to a CA bundle to use 124s :param cert: (optional) Any user-provided SSL certificate to be trusted. 124s :param proxies: (optional) The proxies dictionary to apply to the request. 124s :rtype: requests.Response 124s """ 124s 124s try: 124s conn = self.get_connection(request.url, proxies) 124s except LocationValueError as e: 124s raise InvalidURL(e, request=request) 124s 124s self.cert_verify(conn, request.url, verify, cert) 124s url = self.request_url(request, proxies) 124s self.add_headers( 124s request, 124s stream=stream, 124s timeout=timeout, 124s verify=verify, 124s cert=cert, 124s proxies=proxies, 124s ) 124s 124s chunked = not (request.body is None or "Content-Length" in request.headers) 124s 124s if isinstance(timeout, tuple): 124s try: 124s connect, read = timeout 124s timeout = TimeoutSauce(connect=connect, read=read) 124s except ValueError: 124s raise ValueError( 124s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 124s f"or a single float to set both timeouts to the same value." 
124s ) 124s elif isinstance(timeout, TimeoutSauce): 124s pass 124s else: 124s timeout = TimeoutSauce(connect=timeout, read=timeout) 124s 124s try: 124s resp = conn.urlopen( 124s method=request.method, 124s url=url, 124s body=request.body, 124s headers=request.headers, 124s redirect=False, 124s assert_same_host=False, 124s preload_content=False, 124s decode_content=False, 124s retries=self.max_retries, 124s timeout=timeout, 124s chunked=chunked, 124s ) 124s 124s except (ProtocolError, OSError) as err: 124s > raise ConnectionError(err, request=request) 124s E requests.exceptions.ConnectionError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory')) 124s 124s /usr/lib/python3/dist-packages/requests/adapters.py:501: ConnectionError 124s 124s The above exception was the direct cause of the following exception: 124s 124s cls = 124s 124s @classmethod 124s def setup_class(cls): 124s cls.tmp_dir = TemporaryDirectory() 124s def tmp(*parts): 124s path = os.path.join(cls.tmp_dir.name, *parts) 124s try: 124s os.makedirs(path) 124s except OSError as e: 124s if e.errno != errno.EEXIST: 124s raise 124s return path 124s 124s cls.home_dir = tmp('home') 124s data_dir = cls.data_dir = tmp('data') 124s config_dir = cls.config_dir = tmp('config') 124s runtime_dir = cls.runtime_dir = tmp('runtime') 124s cls.notebook_dir = tmp('notebooks') 124s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 124s cls.env_patch.start() 124s # Patch systemwide & user-wide data & config directories, to isolate 124s # the tests from oddities of the local setup. But leave Python env 124s # locations alone, so data files for e.g. nbconvert are accessible. 124s # If this isolation isn't sufficient, you may need to run the tests in 124s # a virtualenv or conda env. 
124s cls.path_patch = patch.multiple( 124s jupyter_core.paths, 124s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 124s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 124s ) 124s cls.path_patch.start() 124s 124s config = cls.config or Config() 124s config.NotebookNotary.db_file = ':memory:' 124s 124s cls.token = hexlify(os.urandom(4)).decode('ascii') 124s 124s started = Event() 124s def start_thread(): 124s try: 124s bind_args = cls.get_bind_args() 124s app = cls.notebook = NotebookApp( 124s port_retries=0, 124s open_browser=False, 124s config_dir=cls.config_dir, 124s data_dir=cls.data_dir, 124s runtime_dir=cls.runtime_dir, 124s notebook_dir=cls.notebook_dir, 124s base_url=cls.url_prefix, 124s config=config, 124s allow_root=True, 124s token=cls.token, 124s **bind_args 124s ) 124s if "asyncio" in sys.modules: 124s app._init_asyncio_patch() 124s import asyncio 124s 124s asyncio.set_event_loop(asyncio.new_event_loop()) 124s # Patch the current loop in order to match production 124s # behavior 124s import nest_asyncio 124s 124s nest_asyncio.apply() 124s # don't register signal handler during tests 124s app.init_signal = lambda : None 124s # clear log handlers and propagate to root for nose to capture it 124s # needs to be redone after initialize, which reconfigures logging 124s app.log.propagate = True 124s app.log.handlers = [] 124s app.initialize(argv=cls.get_argv()) 124s app.log.propagate = True 124s app.log.handlers = [] 124s loop = IOLoop.current() 124s loop.add_callback(started.set) 124s app.start() 124s finally: 124s # set the event, so failure to start doesn't cause a hang 124s started.set() 124s app.session_manager.close() 124s cls.notebook_thread = Thread(target=start_thread) 124s cls.notebook_thread.daemon = True 124s cls.notebook_thread.start() 124s started.wait() 124s > cls.wait_until_alive() 124s 124s notebook/tests/launchnotebook.py:198: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s cls = 124s 124s 
@classmethod 124s def wait_until_alive(cls): 124s """Wait for the server to be alive""" 124s url = cls.base_url() + 'api/contents' 124s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 124s try: 124s cls.fetch_url(url) 124s except ModuleNotFoundError as error: 124s # Errors that should be immediately thrown back to caller 124s raise error 124s except Exception as e: 124s if not cls.notebook_thread.is_alive(): 124s > raise RuntimeError("The notebook server failed to start") from e 124s E RuntimeError: The notebook server failed to start 124s 124s notebook/tests/launchnotebook.py:59: RuntimeError 124s ______________ ERROR at setup of NotebookUnixSocketTests.test_run ______________ 124s 124s self = 124s method = 'GET', url = '/a%40b/api/contents', body = None 124s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 124s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 124s redirect = False, assert_same_host = False 124s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 124s release_conn = False, chunked = False, body_pos = None, preload_content = False 124s decode_content = False, response_kw = {} 124s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 124s destination_scheme = None, conn = None, release_this_conn = True 124s http_tunnel_required = False, err = None, clean_exit = False 124s 124s def urlopen( # type: ignore[override] 124s self, 124s method: str, 124s url: str, 124s body: _TYPE_BODY | None = None, 124s headers: typing.Mapping[str, str] | None = None, 124s retries: Retry | bool | int | None = None, 124s redirect: bool = True, 124s assert_same_host: bool = True, 124s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 124s pool_timeout: int | None = None, 124s release_conn: bool | None = None, 124s chunked: bool = False, 124s body_pos: _TYPE_BODY_POSITION | None = 
None, 124s preload_content: bool = True, 124s decode_content: bool = True, 124s **response_kw: typing.Any, 124s ) -> BaseHTTPResponse: 124s """ 124s Get a connection from the pool and perform an HTTP request. This is the 124s lowest level call for making a request, so you'll need to specify all 124s the raw details. 124s 124s .. note:: 124s 124s More commonly, it's appropriate to use a convenience method 124s such as :meth:`request`. 124s 124s .. note:: 124s 124s `release_conn` will only behave as expected if 124s `preload_content=False` because we want to make 124s `preload_content=False` the default behaviour someday soon without 124s breaking backwards compatibility. 124s 124s :param method: 124s HTTP request method (such as GET, POST, PUT, etc.) 124s 124s :param url: 124s The URL to perform the request on. 124s 124s :param body: 124s Data to send in the request body, either :class:`str`, :class:`bytes`, 124s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 124s 124s :param headers: 124s Dictionary of custom headers to send, such as User-Agent, 124s If-None-Match, etc. If None, pool headers are used. If provided, 124s these headers completely replace any pool-specific headers. 124s 124s :param retries: 124s Configure the number of retries to allow before raising a 124s :class:`~urllib3.exceptions.MaxRetryError` exception. 124s 124s Pass ``None`` to retry until you receive a response. Pass a 124s :class:`~urllib3.util.retry.Retry` object for fine-grained control 124s over different types of retries. 124s Pass an integer number to retry connection errors that many times, 124s but no other types of errors. Pass zero to never retry. 124s 124s If ``False``, then retries are disabled and any exception is raised 124s immediately. Also, instead of raising a MaxRetryError on redirects, 124s the redirect response will be returned. 124s 124s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
124s 124s :param redirect: 124s If True, automatically handle redirects (status codes 301, 302, 124s 303, 307, 308). Each redirect counts as a retry. Disabling retries 124s will disable redirect, too. 124s 124s :param assert_same_host: 124s If ``True``, will make sure that the host of the pool requests is 124s consistent else will raise HostChangedError. When ``False``, you can 124s use the pool on an HTTP proxy and request foreign hosts. 124s 124s :param timeout: 124s If specified, overrides the default timeout for this one 124s request. It may be a float (in seconds) or an instance of 124s :class:`urllib3.util.Timeout`. 124s 124s :param pool_timeout: 124s If set and the pool is set to block=True, then this method will 124s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 124s connection is available within the time period. 124s 124s :param bool preload_content: 124s If True, the response's body will be preloaded into memory. 124s 124s :param bool decode_content: 124s If True, will attempt to decode the body based on the 124s 'content-encoding' header. 124s 124s :param release_conn: 124s If False, then the urlopen call will not release the connection 124s back into the pool once a response is received (but will release if 124s you read the entire contents of the response such as when 124s `preload_content=True`). This is useful if you're not preloading 124s the response's content immediately. You will need to call 124s ``r.release_conn()`` on the response ``r`` to return the connection 124s back into the pool. If None, it takes the value of ``preload_content`` 124s which defaults to ``True``. 124s 124s :param bool chunked: 124s If True, urllib3 will send the body using chunked transfer 124s encoding. Otherwise, urllib3 will send the body using the standard 124s content-length form. Defaults to False. 124s 124s :param int body_pos: 124s Position to seek to in file-like body in the event of a retry or 124s redirect. 
Typically this won't need to be set because urllib3 will 124s auto-populate the value when needed. 124s """ 124s parsed_url = parse_url(url) 124s destination_scheme = parsed_url.scheme 124s 124s if headers is None: 124s headers = self.headers 124s 124s if not isinstance(retries, Retry): 124s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 124s 124s if release_conn is None: 124s release_conn = preload_content 124s 124s # Check host 124s if assert_same_host and not self.is_same_host(url): 124s raise HostChangedError(self, url, retries) 124s 124s # Ensure that the URL we're connecting to is properly encoded 124s if url.startswith("/"): 124s url = to_str(_encode_target(url)) 124s else: 124s url = to_str(parsed_url.url) 124s 124s conn = None 124s 124s # Track whether `conn` needs to be released before 124s # returning/raising/recursing. Update this variable if necessary, and 124s # leave `release_conn` constant throughout the function. That way, if 124s # the function recurses, the original value of `release_conn` will be 124s # passed down into the recursive call, and its value will be respected. 124s # 124s # See issue #651 [1] for details. 124s # 124s # [1] 124s release_this_conn = release_conn 124s 124s http_tunnel_required = connection_requires_http_tunnel( 124s self.proxy, self.proxy_config, destination_scheme 124s ) 124s 124s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 124s # have to copy the headers dict so we can safely change it without those 124s # changes being reflected in anyone else's copy. 124s if not http_tunnel_required: 124s headers = headers.copy() # type: ignore[attr-defined] 124s headers.update(self.proxy_headers) # type: ignore[union-attr] 124s 124s # Must keep the exception bound to a separate variable or else Python 3 124s # complains about UnboundLocalError. 124s err = None 124s 124s # Keep track of whether we cleanly exited the except block. 
This 124s # ensures we do proper cleanup in finally. 124s clean_exit = False 124s 124s # Rewind body position, if needed. Record current position 124s # for future rewinds in the event of a redirect/retry. 124s body_pos = set_file_position(body, body_pos) 124s 124s try: 124s # Request a connection from the queue. 124s timeout_obj = self._get_timeout(timeout) 124s conn = self._get_conn(timeout=pool_timeout) 124s 124s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 124s 124s # Is this a closed/new connection that requires CONNECT tunnelling? 124s if self.proxy is not None and http_tunnel_required and conn.is_closed: 124s try: 124s self._prepare_proxy(conn) 124s except (BaseSSLError, OSError, SocketTimeout) as e: 124s self._raise_timeout( 124s err=e, url=self.proxy.url, timeout_value=conn.timeout 124s ) 124s raise 124s 124s # If we're going to release the connection in ``finally:``, then 124s # the response doesn't need to know about the connection. Otherwise 124s # it will also try to release it and we'll have a double-release 124s # mess. 
124s response_conn = conn if not release_conn else None 124s 124s # Make the request on the HTTPConnection object 124s > response = self._make_request( 124s conn, 124s method, 124s url, 124s timeout=timeout_obj, 124s body=body, 124s headers=headers, 124s chunked=chunked, 124s retries=retries, 124s response_conn=response_conn, 124s preload_content=preload_content, 124s decode_content=decode_content, 124s **response_kw, 124s ) 124s 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 124s conn.request( 124s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 124s self.endheaders() 124s /usr/lib/python3.12/http/client.py:1331: in endheaders 124s self._send_output(message_body, encode_chunked=encode_chunked) 124s /usr/lib/python3.12/http/client.py:1091: in _send_output 124s self.send(msg) 124s /usr/lib/python3.12/http/client.py:1035: in send 124s self.connect() 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s 124s def connect(self): 124s sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) 124s sock.settimeout(self.timeout) 124s socket_path = unquote(urlparse(self.unix_socket_url).netloc) 124s > sock.connect(socket_path) 124s E FileNotFoundError: [Errno 2] No such file or directory 124s 124s /usr/lib/python3/dist-packages/requests_unixsocket/adapters.py:36: FileNotFoundError 124s 124s During handling of the above exception, another exception occurred: 124s 124s self = 124s request = , stream = False 124s timeout = Timeout(connect=None, read=None, total=None), verify = True 124s cert = None 124s proxies = OrderedDict({'no': '127.0.0.1,127.0.1.1,login.ubuntu.com,localhost,localdomain,novalocal,internal,archive.ubuntu.com,p...,objectstorage.prodstack5.canonical.com', 'https': 'http://squid.internal:3128', 'http': 
'http://squid.internal:3128'}) 124s 124s def send( 124s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 124s ): 124s """Sends PreparedRequest object. Returns Response object. 124s 124s :param request: The :class:`PreparedRequest ` being sent. 124s :param stream: (optional) Whether to stream the request content. 124s :param timeout: (optional) How long to wait for the server to send 124s data before giving up, as a float, or a :ref:`(connect timeout, 124s read timeout) ` tuple. 124s :type timeout: float or tuple or urllib3 Timeout object 124s :param verify: (optional) Either a boolean, in which case it controls whether 124s we verify the server's TLS certificate, or a string, in which case it 124s must be a path to a CA bundle to use 124s :param cert: (optional) Any user-provided SSL certificate to be trusted. 124s :param proxies: (optional) The proxies dictionary to apply to the request. 124s :rtype: requests.Response 124s """ 124s 124s try: 124s conn = self.get_connection(request.url, proxies) 124s except LocationValueError as e: 124s raise InvalidURL(e, request=request) 124s 124s self.cert_verify(conn, request.url, verify, cert) 124s url = self.request_url(request, proxies) 124s self.add_headers( 124s request, 124s stream=stream, 124s timeout=timeout, 124s verify=verify, 124s cert=cert, 124s proxies=proxies, 124s ) 124s 124s chunked = not (request.body is None or "Content-Length" in request.headers) 124s 124s if isinstance(timeout, tuple): 124s try: 124s connect, read = timeout 124s timeout = TimeoutSauce(connect=connect, read=read) 124s except ValueError: 124s raise ValueError( 124s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 124s f"or a single float to set both timeouts to the same value." 
124s ) 124s elif isinstance(timeout, TimeoutSauce): 124s pass 124s else: 124s timeout = TimeoutSauce(connect=timeout, read=timeout) 124s 124s try: 124s > resp = conn.urlopen( 124s method=request.method, 124s url=url, 124s body=request.body, 124s headers=request.headers, 124s redirect=False, 124s assert_same_host=False, 124s preload_content=False, 124s decode_content=False, 124s retries=self.max_retries, 124s timeout=timeout, 124s chunked=chunked, 124s ) 124s 124s /usr/lib/python3/dist-packages/requests/adapters.py:486: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 124s retries = retries.increment( 124s /usr/lib/python3/dist-packages/urllib3/util/retry.py:470: in increment 124s raise reraise(type(error), error, _stacktrace) 124s /usr/lib/python3/dist-packages/urllib3/util/util.py:38: in reraise 124s raise value.with_traceback(tb) 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: in urlopen 124s response = self._make_request( 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 124s conn.request( 124s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 124s self.endheaders() 124s /usr/lib/python3.12/http/client.py:1331: in endheaders 124s self._send_output(message_body, encode_chunked=encode_chunked) 124s /usr/lib/python3.12/http/client.py:1091: in _send_output 124s self.send(msg) 124s /usr/lib/python3.12/http/client.py:1035: in send 124s self.connect() 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s 124s def connect(self): 124s sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) 124s sock.settimeout(self.timeout) 124s socket_path = unquote(urlparse(self.unix_socket_url).netloc) 124s > sock.connect(socket_path) 124s E urllib3.exceptions.ProtocolError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory')) 
124s 124s /usr/lib/python3/dist-packages/requests_unixsocket/adapters.py:36: ProtocolError 124s 124s During handling of the above exception, another exception occurred: 124s 124s cls = 124s 124s @classmethod 124s def wait_until_alive(cls): 124s """Wait for the server to be alive""" 124s url = cls.base_url() + 'api/contents' 124s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 124s try: 124s > cls.fetch_url(url) 124s 124s notebook/tests/launchnotebook.py:53: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s notebook/tests/launchnotebook.py:242: in fetch_url 124s return requests.get(url) 124s /usr/lib/python3/dist-packages/requests_unixsocket/__init__.py:51: in get 124s return request('get', url, **kwargs) 124s /usr/lib/python3/dist-packages/requests_unixsocket/__init__.py:46: in request 124s return session.request(method=method, url=url, **kwargs) 124s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 124s resp = self.send(prep, **send_kwargs) 124s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 124s r = adapter.send(request, **kwargs) 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s request = , stream = False 124s timeout = Timeout(connect=None, read=None, total=None), verify = True 124s cert = None 124s proxies = OrderedDict({'no': '127.0.0.1,127.0.1.1,login.ubuntu.com,localhost,localdomain,novalocal,internal,archive.ubuntu.com,p...,objectstorage.prodstack5.canonical.com', 'https': 'http://squid.internal:3128', 'http': 'http://squid.internal:3128'}) 124s 124s def send( 124s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 124s ): 124s """Sends PreparedRequest object. Returns Response object. 124s 124s :param request: The :class:`PreparedRequest ` being sent. 124s :param stream: (optional) Whether to stream the request content. 
124s :param timeout: (optional) How long to wait for the server to send 124s data before giving up, as a float, or a :ref:`(connect timeout, 124s read timeout) ` tuple. 124s :type timeout: float or tuple or urllib3 Timeout object 124s :param verify: (optional) Either a boolean, in which case it controls whether 124s we verify the server's TLS certificate, or a string, in which case it 124s must be a path to a CA bundle to use 124s :param cert: (optional) Any user-provided SSL certificate to be trusted. 124s :param proxies: (optional) The proxies dictionary to apply to the request. 124s :rtype: requests.Response 124s """ 124s 124s try: 124s conn = self.get_connection(request.url, proxies) 124s except LocationValueError as e: 124s raise InvalidURL(e, request=request) 124s 124s self.cert_verify(conn, request.url, verify, cert) 124s url = self.request_url(request, proxies) 124s self.add_headers( 124s request, 124s stream=stream, 124s timeout=timeout, 124s verify=verify, 124s cert=cert, 124s proxies=proxies, 124s ) 124s 124s chunked = not (request.body is None or "Content-Length" in request.headers) 124s 124s if isinstance(timeout, tuple): 124s try: 124s connect, read = timeout 124s timeout = TimeoutSauce(connect=connect, read=read) 124s except ValueError: 124s raise ValueError( 124s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 124s f"or a single float to set both timeouts to the same value." 
124s ) 124s elif isinstance(timeout, TimeoutSauce): 124s pass 124s else: 124s timeout = TimeoutSauce(connect=timeout, read=timeout) 124s 124s try: 124s resp = conn.urlopen( 124s method=request.method, 124s url=url, 124s body=request.body, 124s headers=request.headers, 124s redirect=False, 124s assert_same_host=False, 124s preload_content=False, 124s decode_content=False, 124s retries=self.max_retries, 124s timeout=timeout, 124s chunked=chunked, 124s ) 124s 124s except (ProtocolError, OSError) as err: 124s > raise ConnectionError(err, request=request) 124s E requests.exceptions.ConnectionError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory')) 124s 124s /usr/lib/python3/dist-packages/requests/adapters.py:501: ConnectionError 124s 124s The above exception was the direct cause of the following exception: 124s 124s cls = 124s 124s @classmethod 124s def setup_class(cls): 124s cls.tmp_dir = TemporaryDirectory() 124s def tmp(*parts): 124s path = os.path.join(cls.tmp_dir.name, *parts) 124s try: 124s os.makedirs(path) 124s except OSError as e: 124s if e.errno != errno.EEXIST: 124s raise 124s return path 124s 124s cls.home_dir = tmp('home') 124s data_dir = cls.data_dir = tmp('data') 124s config_dir = cls.config_dir = tmp('config') 124s runtime_dir = cls.runtime_dir = tmp('runtime') 124s cls.notebook_dir = tmp('notebooks') 124s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 124s cls.env_patch.start() 124s # Patch systemwide & user-wide data & config directories, to isolate 124s # the tests from oddities of the local setup. But leave Python env 124s # locations alone, so data files for e.g. nbconvert are accessible. 124s # If this isolation isn't sufficient, you may need to run the tests in 124s # a virtualenv or conda env. 
124s cls.path_patch = patch.multiple( 124s jupyter_core.paths, 124s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 124s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 124s ) 124s cls.path_patch.start() 124s 124s config = cls.config or Config() 124s config.NotebookNotary.db_file = ':memory:' 124s 124s cls.token = hexlify(os.urandom(4)).decode('ascii') 124s 124s started = Event() 124s def start_thread(): 124s try: 124s bind_args = cls.get_bind_args() 124s app = cls.notebook = NotebookApp( 124s port_retries=0, 124s open_browser=False, 124s config_dir=cls.config_dir, 124s data_dir=cls.data_dir, 124s runtime_dir=cls.runtime_dir, 124s notebook_dir=cls.notebook_dir, 124s base_url=cls.url_prefix, 124s config=config, 124s allow_root=True, 124s token=cls.token, 124s **bind_args 124s ) 124s if "asyncio" in sys.modules: 124s app._init_asyncio_patch() 124s import asyncio 124s 124s asyncio.set_event_loop(asyncio.new_event_loop()) 124s # Patch the current loop in order to match production 124s # behavior 124s import nest_asyncio 124s 124s nest_asyncio.apply() 124s # don't register signal handler during tests 124s app.init_signal = lambda : None 124s # clear log handlers and propagate to root for nose to capture it 124s # needs to be redone after initialize, which reconfigures logging 124s app.log.propagate = True 124s app.log.handlers = [] 124s app.initialize(argv=cls.get_argv()) 124s app.log.propagate = True 124s app.log.handlers = [] 124s loop = IOLoop.current() 124s loop.add_callback(started.set) 124s app.start() 124s finally: 124s # set the event, so failure to start doesn't cause a hang 124s started.set() 124s app.session_manager.close() 124s cls.notebook_thread = Thread(target=start_thread) 124s cls.notebook_thread.daemon = True 124s cls.notebook_thread.start() 124s started.wait() 124s > cls.wait_until_alive() 124s 124s notebook/tests/launchnotebook.py:198: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s cls = 124s 124s 
@classmethod 124s def wait_until_alive(cls): 124s """Wait for the server to be alive""" 124s url = cls.base_url() + 'api/contents' 124s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 124s try: 124s cls.fetch_url(url) 124s except ModuleNotFoundError as error: 124s # Errors that should be immediately thrown back to caller 124s raise error 124s except Exception as e: 124s if not cls.notebook_thread.is_alive(): 124s > raise RuntimeError("The notebook server failed to start") from e 124s E RuntimeError: The notebook server failed to start 124s 124s notebook/tests/launchnotebook.py:59: RuntimeError 124s _____ ERROR at setup of NotebookAppJSONLoggingTests.test_log_json_enabled ______ 124s 124s self = 124s 124s def _new_conn(self) -> socket.socket: 124s """Establish a socket connection and set nodelay settings on it. 124s 124s :return: New socket connection. 124s """ 124s try: 124s > sock = connection.create_connection( 124s (self._dns_host, self.port), 124s self.timeout, 124s source_address=self.source_address, 124s socket_options=self.socket_options, 124s ) 124s 124s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 124s raise err 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s address = ('localhost', 12341), timeout = None, source_address = None 124s socket_options = [(6, 1, 1)] 124s 124s def create_connection( 124s address: tuple[str, int], 124s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 124s source_address: tuple[str, int] | None = None, 124s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 124s ) -> socket.socket: 124s """Connect to *address* and return the socket object. 124s 124s Convenience function. Connect to *address* (a 2-tuple ``(host, 124s port)``) and return the socket object. 
Passing the optional 124s *timeout* parameter will set the timeout on the socket instance 124s before attempting to connect. If no *timeout* is supplied, the 124s global default timeout setting returned by :func:`socket.getdefaulttimeout` 124s is used. If *source_address* is set it must be a tuple of (host, port) 124s for the socket to bind as a source address before making the connection. 124s An host of '' or port 0 tells the OS to use the default. 124s """ 124s 124s host, port = address 124s if host.startswith("["): 124s host = host.strip("[]") 124s err = None 124s 124s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 124s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 124s # The original create_connection function always returns all records. 124s family = allowed_gai_family() 124s 124s try: 124s host.encode("idna") 124s except UnicodeError: 124s raise LocationParseError(f"'{host}', label empty or too long") from None 124s 124s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 124s af, socktype, proto, canonname, sa = res 124s sock = None 124s try: 124s sock = socket.socket(af, socktype, proto) 124s 124s # If provided, set socket level options before connecting. 
124s _set_socket_options(sock, socket_options) 124s 124s if timeout is not _DEFAULT_TIMEOUT: 124s sock.settimeout(timeout) 124s if source_address: 124s sock.bind(source_address) 124s > sock.connect(sa) 124s E ConnectionRefusedError: [Errno 111] Connection refused 124s 124s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 124s 124s The above exception was the direct cause of the following exception: 124s 124s self = 124s method = 'GET', url = '/a%40b/api/contents', body = None 124s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 124s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 124s redirect = False, assert_same_host = False 124s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 124s release_conn = False, chunked = False, body_pos = None, preload_content = False 124s decode_content = False, response_kw = {} 124s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 124s destination_scheme = None, conn = None, release_this_conn = True 124s http_tunnel_required = False, err = None, clean_exit = False 124s 124s def urlopen( # type: ignore[override] 124s self, 124s method: str, 124s url: str, 124s body: _TYPE_BODY | None = None, 124s headers: typing.Mapping[str, str] | None = None, 124s retries: Retry | bool | int | None = None, 124s redirect: bool = True, 124s assert_same_host: bool = True, 124s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 124s pool_timeout: int | None = None, 124s release_conn: bool | None = None, 124s chunked: bool = False, 124s body_pos: _TYPE_BODY_POSITION | None = None, 124s preload_content: bool = True, 124s decode_content: bool = True, 124s **response_kw: typing.Any, 124s ) -> BaseHTTPResponse: 124s """ 124s Get a connection from the pool and perform an HTTP request. 
This is the 124s lowest level call for making a request, so you'll need to specify all 124s the raw details. 124s 124s .. note:: 124s 124s More commonly, it's appropriate to use a convenience method 124s such as :meth:`request`. 124s 124s .. note:: 124s 124s `release_conn` will only behave as expected if 124s `preload_content=False` because we want to make 124s `preload_content=False` the default behaviour someday soon without 124s breaking backwards compatibility. 124s 124s :param method: 124s HTTP request method (such as GET, POST, PUT, etc.) 124s 124s :param url: 124s The URL to perform the request on. 124s 124s :param body: 124s Data to send in the request body, either :class:`str`, :class:`bytes`, 124s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 124s 124s :param headers: 124s Dictionary of custom headers to send, such as User-Agent, 124s If-None-Match, etc. If None, pool headers are used. If provided, 124s these headers completely replace any pool-specific headers. 124s 124s :param retries: 124s Configure the number of retries to allow before raising a 124s :class:`~urllib3.exceptions.MaxRetryError` exception. 124s 124s Pass ``None`` to retry until you receive a response. Pass a 124s :class:`~urllib3.util.retry.Retry` object for fine-grained control 124s over different types of retries. 124s Pass an integer number to retry connection errors that many times, 124s but no other types of errors. Pass zero to never retry. 124s 124s If ``False``, then retries are disabled and any exception is raised 124s immediately. Also, instead of raising a MaxRetryError on redirects, 124s the redirect response will be returned. 124s 124s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 124s 124s :param redirect: 124s If True, automatically handle redirects (status codes 301, 302, 124s 303, 307, 308). Each redirect counts as a retry. Disabling retries 124s will disable redirect, too. 
124s 124s :param assert_same_host: 124s If ``True``, will make sure that the host of the pool requests is 124s consistent else will raise HostChangedError. When ``False``, you can 124s use the pool on an HTTP proxy and request foreign hosts. 124s 124s :param timeout: 124s If specified, overrides the default timeout for this one 124s request. It may be a float (in seconds) or an instance of 124s :class:`urllib3.util.Timeout`. 124s 124s :param pool_timeout: 124s If set and the pool is set to block=True, then this method will 124s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 124s connection is available within the time period. 124s 124s :param bool preload_content: 124s If True, the response's body will be preloaded into memory. 124s 124s :param bool decode_content: 124s If True, will attempt to decode the body based on the 124s 'content-encoding' header. 124s 124s :param release_conn: 124s If False, then the urlopen call will not release the connection 124s back into the pool once a response is received (but will release if 124s you read the entire contents of the response such as when 124s `preload_content=True`). This is useful if you're not preloading 124s the response's content immediately. You will need to call 124s ``r.release_conn()`` on the response ``r`` to return the connection 124s back into the pool. If None, it takes the value of ``preload_content`` 124s which defaults to ``True``. 124s 124s :param bool chunked: 124s If True, urllib3 will send the body using chunked transfer 124s encoding. Otherwise, urllib3 will send the body using the standard 124s content-length form. Defaults to False. 124s 124s :param int body_pos: 124s Position to seek to in file-like body in the event of a retry or 124s redirect. Typically this won't need to be set because urllib3 will 124s auto-populate the value when needed. 
124s """ 124s parsed_url = parse_url(url) 124s destination_scheme = parsed_url.scheme 124s 124s if headers is None: 124s headers = self.headers 124s 124s if not isinstance(retries, Retry): 124s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 124s 124s if release_conn is None: 124s release_conn = preload_content 124s 124s # Check host 124s if assert_same_host and not self.is_same_host(url): 124s raise HostChangedError(self, url, retries) 124s 124s # Ensure that the URL we're connecting to is properly encoded 124s if url.startswith("/"): 124s url = to_str(_encode_target(url)) 124s else: 124s url = to_str(parsed_url.url) 124s 124s conn = None 124s 124s # Track whether `conn` needs to be released before 124s # returning/raising/recursing. Update this variable if necessary, and 124s # leave `release_conn` constant throughout the function. That way, if 124s # the function recurses, the original value of `release_conn` will be 124s # passed down into the recursive call, and its value will be respected. 124s # 124s # See issue #651 [1] for details. 124s # 124s # [1] 124s release_this_conn = release_conn 124s 124s http_tunnel_required = connection_requires_http_tunnel( 124s self.proxy, self.proxy_config, destination_scheme 124s ) 124s 124s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 124s # have to copy the headers dict so we can safely change it without those 124s # changes being reflected in anyone else's copy. 124s if not http_tunnel_required: 124s headers = headers.copy() # type: ignore[attr-defined] 124s headers.update(self.proxy_headers) # type: ignore[union-attr] 124s 124s # Must keep the exception bound to a separate variable or else Python 3 124s # complains about UnboundLocalError. 124s err = None 124s 124s # Keep track of whether we cleanly exited the except block. This 124s # ensures we do proper cleanup in finally. 124s clean_exit = False 124s 124s # Rewind body position, if needed. 
Record current position 124s # for future rewinds in the event of a redirect/retry. 124s body_pos = set_file_position(body, body_pos) 124s 124s try: 124s # Request a connection from the queue. 124s timeout_obj = self._get_timeout(timeout) 124s conn = self._get_conn(timeout=pool_timeout) 124s 124s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 124s 124s # Is this a closed/new connection that requires CONNECT tunnelling? 124s if self.proxy is not None and http_tunnel_required and conn.is_closed: 124s try: 124s self._prepare_proxy(conn) 124s except (BaseSSLError, OSError, SocketTimeout) as e: 124s self._raise_timeout( 124s err=e, url=self.proxy.url, timeout_value=conn.timeout 124s ) 124s raise 124s 124s # If we're going to release the connection in ``finally:``, then 124s # the response doesn't need to know about the connection. Otherwise 124s # it will also try to release it and we'll have a double-release 124s # mess. 124s response_conn = conn if not release_conn else None 124s 124s # Make the request on the HTTPConnection object 124s > response = self._make_request( 124s conn, 124s method, 124s url, 124s timeout=timeout_obj, 124s body=body, 124s headers=headers, 124s chunked=chunked, 124s retries=retries, 124s response_conn=response_conn, 124s preload_content=preload_content, 124s decode_content=decode_content, 124s **response_kw, 124s ) 124s 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 124s conn.request( 124s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 124s self.endheaders() 124s /usr/lib/python3.12/http/client.py:1331: in endheaders 124s self._send_output(message_body, encode_chunked=encode_chunked) 124s /usr/lib/python3.12/http/client.py:1091: in _send_output 124s self.send(msg) 124s /usr/lib/python3.12/http/client.py:1035: in 
send 124s self.connect() 124s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 124s self.sock = self._new_conn() 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s 124s def _new_conn(self) -> socket.socket: 124s """Establish a socket connection and set nodelay settings on it. 124s 124s :return: New socket connection. 124s """ 124s try: 124s sock = connection.create_connection( 124s (self._dns_host, self.port), 124s self.timeout, 124s source_address=self.source_address, 124s socket_options=self.socket_options, 124s ) 124s except socket.gaierror as e: 124s raise NameResolutionError(self.host, self, e) from e 124s except SocketTimeout as e: 124s raise ConnectTimeoutError( 124s self, 124s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 124s ) from e 124s 124s except OSError as e: 124s > raise NewConnectionError( 124s self, f"Failed to establish a new connection: {e}" 124s ) from e 124s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 124s 124s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 124s 124s The above exception was the direct cause of the following exception: 124s 124s self = 124s request = , stream = False 124s timeout = Timeout(connect=None, read=None, total=None), verify = True 124s cert = None, proxies = OrderedDict() 124s 124s def send( 124s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 124s ): 124s """Sends PreparedRequest object. Returns Response object. 124s 124s :param request: The :class:`PreparedRequest ` being sent. 124s :param stream: (optional) Whether to stream the request content. 124s :param timeout: (optional) How long to wait for the server to send 124s data before giving up, as a float, or a :ref:`(connect timeout, 124s read timeout) ` tuple. 
124s :type timeout: float or tuple or urllib3 Timeout object 124s :param verify: (optional) Either a boolean, in which case it controls whether 124s we verify the server's TLS certificate, or a string, in which case it 124s must be a path to a CA bundle to use 124s :param cert: (optional) Any user-provided SSL certificate to be trusted. 124s :param proxies: (optional) The proxies dictionary to apply to the request. 124s :rtype: requests.Response 124s """ 124s 124s try: 124s conn = self.get_connection(request.url, proxies) 124s except LocationValueError as e: 124s raise InvalidURL(e, request=request) 124s 124s self.cert_verify(conn, request.url, verify, cert) 124s url = self.request_url(request, proxies) 124s self.add_headers( 124s request, 124s stream=stream, 124s timeout=timeout, 124s verify=verify, 124s cert=cert, 124s proxies=proxies, 124s ) 124s 124s chunked = not (request.body is None or "Content-Length" in request.headers) 124s 124s if isinstance(timeout, tuple): 124s try: 124s connect, read = timeout 124s timeout = TimeoutSauce(connect=connect, read=read) 124s except ValueError: 124s raise ValueError( 124s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 124s f"or a single float to set both timeouts to the same value." 
124s ) 124s elif isinstance(timeout, TimeoutSauce): 124s pass 124s else: 124s timeout = TimeoutSauce(connect=timeout, read=timeout) 124s 124s try: 124s > resp = conn.urlopen( 124s method=request.method, 124s url=url, 124s body=request.body, 124s headers=request.headers, 124s redirect=False, 124s assert_same_host=False, 124s preload_content=False, 124s decode_content=False, 124s retries=self.max_retries, 124s timeout=timeout, 124s chunked=chunked, 124s ) 124s 124s /usr/lib/python3/dist-packages/requests/adapters.py:486: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 124s retries = retries.increment( 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 124s method = 'GET', url = '/a%40b/api/contents', response = None 124s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 124s _pool = 124s _stacktrace = 124s 124s def increment( 124s self, 124s method: str | None = None, 124s url: str | None = None, 124s response: BaseHTTPResponse | None = None, 124s error: Exception | None = None, 124s _pool: ConnectionPool | None = None, 124s _stacktrace: TracebackType | None = None, 124s ) -> Retry: 124s """Return a new Retry object with incremented retry counters. 124s 124s :param response: A response object, or None, if the server did not 124s return a response. 124s :type response: :class:`~urllib3.response.BaseHTTPResponse` 124s :param Exception error: An error encountered during the request, or 124s None if the response was received successfully. 124s 124s :return: A new ``Retry`` object. 124s """ 124s if self.total is False and error: 124s # Disabled, indicate to re-raise the error. 
124s raise reraise(type(error), error, _stacktrace) 124s 124s total = self.total 124s if total is not None: 124s total -= 1 124s 124s connect = self.connect 124s read = self.read 124s redirect = self.redirect 124s status_count = self.status 124s other = self.other 124s cause = "unknown" 124s status = None 124s redirect_location = None 124s 124s if error and self._is_connection_error(error): 124s # Connect retry? 124s if connect is False: 124s raise reraise(type(error), error, _stacktrace) 124s elif connect is not None: 124s connect -= 1 124s 124s elif error and self._is_read_error(error): 124s # Read retry? 124s if read is False or method is None or not self._is_method_retryable(method): 124s raise reraise(type(error), error, _stacktrace) 124s elif read is not None: 124s read -= 1 124s 124s elif error: 124s # Other retry? 124s if other is not None: 124s other -= 1 124s 124s elif response and response.get_redirect_location(): 124s # Redirect retry? 124s if redirect is not None: 124s redirect -= 1 124s cause = "too many redirects" 124s response_redirect_location = response.get_redirect_location() 124s if response_redirect_location: 124s redirect_location = response_redirect_location 124s status = response.status 124s 124s else: 124s # Incrementing because of a server error like a 500 in 124s # status_forcelist and the given method is in the allowed_methods 124s cause = ResponseError.GENERIC_ERROR 124s if response and response.status: 124s if status_count is not None: 124s status_count -= 1 124s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 124s status = response.status 124s 124s history = self.history + ( 124s RequestHistory(method, url, error, status, redirect_location), 124s ) 124s 124s new_retry = self.new( 124s total=total, 124s connect=connect, 124s read=read, 124s redirect=redirect, 124s status=status_count, 124s other=other, 124s history=history, 124s ) 124s 124s if new_retry.is_exhausted(): 124s reason = error or 
ResponseError(cause) 124s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 124s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 124s 124s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 124s 124s During handling of the above exception, another exception occurred: 124s 124s cls = 124s 124s @classmethod 124s def wait_until_alive(cls): 124s """Wait for the server to be alive""" 124s url = cls.base_url() + 'api/contents' 124s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 124s try: 124s > cls.fetch_url(url) 124s 124s notebook/tests/launchnotebook.py:53: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s notebook/tests/launchnotebook.py:82: in fetch_url 124s return requests.get(url) 124s /usr/lib/python3/dist-packages/requests/api.py:73: in get 124s return request("get", url, params=params, **kwargs) 124s /usr/lib/python3/dist-packages/requests/api.py:59: in request 124s return session.request(method=method, url=url, **kwargs) 124s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 124s resp = self.send(prep, **send_kwargs) 124s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 124s r = adapter.send(request, **kwargs) 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s request = , stream = False 124s timeout = Timeout(connect=None, read=None, total=None), verify = True 124s cert = None, proxies = OrderedDict() 124s 124s def send( 124s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 124s ): 124s """Sends PreparedRequest object. Returns Response object. 124s 124s :param request: The :class:`PreparedRequest ` being sent. 
124s :param stream: (optional) Whether to stream the request content. 124s :param timeout: (optional) How long to wait for the server to send 124s data before giving up, as a float, or a :ref:`(connect timeout, 124s read timeout) ` tuple. 124s :type timeout: float or tuple or urllib3 Timeout object 124s :param verify: (optional) Either a boolean, in which case it controls whether 124s we verify the server's TLS certificate, or a string, in which case it 124s must be a path to a CA bundle to use 124s :param cert: (optional) Any user-provided SSL certificate to be trusted. 124s :param proxies: (optional) The proxies dictionary to apply to the request. 124s :rtype: requests.Response 124s """ 124s 124s try: 124s conn = self.get_connection(request.url, proxies) 124s except LocationValueError as e: 124s raise InvalidURL(e, request=request) 124s 124s self.cert_verify(conn, request.url, verify, cert) 124s url = self.request_url(request, proxies) 124s self.add_headers( 124s request, 124s stream=stream, 124s timeout=timeout, 124s verify=verify, 124s cert=cert, 124s proxies=proxies, 124s ) 124s 124s chunked = not (request.body is None or "Content-Length" in request.headers) 124s 124s if isinstance(timeout, tuple): 124s try: 124s connect, read = timeout 124s timeout = TimeoutSauce(connect=connect, read=read) 124s except ValueError: 124s raise ValueError( 124s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 124s f"or a single float to set both timeouts to the same value." 
124s ) 124s elif isinstance(timeout, TimeoutSauce): 124s pass 124s else: 124s timeout = TimeoutSauce(connect=timeout, read=timeout) 124s 124s try: 124s resp = conn.urlopen( 124s method=request.method, 124s url=url, 124s body=request.body, 124s headers=request.headers, 124s redirect=False, 124s assert_same_host=False, 124s preload_content=False, 124s decode_content=False, 124s retries=self.max_retries, 124s timeout=timeout, 124s chunked=chunked, 124s ) 124s 124s except (ProtocolError, OSError) as err: 124s raise ConnectionError(err, request=request) 124s 124s except MaxRetryError as e: 124s if isinstance(e.reason, ConnectTimeoutError): 124s # TODO: Remove this in 3.0.0: see #2811 124s if not isinstance(e.reason, NewConnectionError): 124s raise ConnectTimeout(e, request=request) 124s 124s if isinstance(e.reason, ResponseError): 124s raise RetryError(e, request=request) 124s 124s if isinstance(e.reason, _ProxyError): 124s raise ProxyError(e, request=request) 124s 124s if isinstance(e.reason, _SSLError): 124s # This branch is for urllib3 v1.22 and later. 
124s raise SSLError(e, request=request) 124s 124s > raise ConnectionError(e, request=request) 124s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 124s 124s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 124s 124s The above exception was the direct cause of the following exception: 124s 124s cls = 124s 124s @classmethod 124s def setup_class(cls): 124s > super().setup_class() 124s 124s notebook/tests/test_notebookapp.py:212: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s notebook/tests/launchnotebook.py:198: in setup_class 124s cls.wait_until_alive() 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s cls = 124s 124s @classmethod 124s def wait_until_alive(cls): 124s """Wait for the server to be alive""" 124s url = cls.base_url() + 'api/contents' 124s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 124s try: 124s cls.fetch_url(url) 124s except ModuleNotFoundError as error: 124s # Errors that should be immediately thrown back to caller 124s raise error 124s except Exception as e: 124s if not cls.notebook_thread.is_alive(): 124s > raise RuntimeError("The notebook server failed to start") from e 124s E RuntimeError: The notebook server failed to start 124s 124s notebook/tests/launchnotebook.py:59: RuntimeError 124s _____ ERROR at setup of NotebookAppJSONLoggingTests.test_validate_log_json _____ 124s 124s self = 124s 124s def _new_conn(self) -> socket.socket: 124s """Establish a socket connection and set nodelay settings on it. 124s 124s :return: New socket connection. 
124s """ 124s try: 124s > sock = connection.create_connection( 124s (self._dns_host, self.port), 124s self.timeout, 124s source_address=self.source_address, 124s socket_options=self.socket_options, 124s ) 124s 124s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 124s raise err 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s address = ('localhost', 12341), timeout = None, source_address = None 124s socket_options = [(6, 1, 1)] 124s 124s def create_connection( 124s address: tuple[str, int], 124s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 124s source_address: tuple[str, int] | None = None, 124s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 124s ) -> socket.socket: 124s """Connect to *address* and return the socket object. 124s 124s Convenience function. Connect to *address* (a 2-tuple ``(host, 124s port)``) and return the socket object. Passing the optional 124s *timeout* parameter will set the timeout on the socket instance 124s before attempting to connect. If no *timeout* is supplied, the 124s global default timeout setting returned by :func:`socket.getdefaulttimeout` 124s is used. If *source_address* is set it must be a tuple of (host, port) 124s for the socket to bind as a source address before making the connection. 124s An host of '' or port 0 tells the OS to use the default. 124s """ 124s 124s host, port = address 124s if host.startswith("["): 124s host = host.strip("[]") 124s err = None 124s 124s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 124s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 124s # The original create_connection function always returns all records. 
124s         family = allowed_gai_family()
124s 
124s         try:
124s             host.encode("idna")
124s         except UnicodeError:
124s             raise LocationParseError(f"'{host}', label empty or too long") from None
124s 
124s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
124s             af, socktype, proto, canonname, sa = res
124s             sock = None
124s             try:
124s                 sock = socket.socket(af, socktype, proto)
124s 
124s                 # If provided, set socket level options before connecting.
124s                 _set_socket_options(sock, socket_options)
124s 
124s                 if timeout is not _DEFAULT_TIMEOUT:
124s                     sock.settimeout(timeout)
124s                 if source_address:
124s                     sock.bind(source_address)
124s >               sock.connect(sa)
124s E               ConnectionRefusedError: [Errno 111] Connection refused
124s 
124s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
124s 
124s The above exception was the direct cause of the following exception:
124s 
124s self =
124s method = 'GET', url = '/a%40b/api/contents', body = None
124s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
124s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
124s redirect = False, assert_same_host = False
124s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
124s release_conn = False, chunked = False, body_pos = None, preload_content = False
124s decode_content = False, response_kw = {}
124s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
124s destination_scheme = None, conn = None, release_this_conn = True
124s http_tunnel_required = False, err = None, clean_exit = False
124s 
124s     def urlopen( # type: ignore[override]
124s         self,
124s         method: str,
124s         url: str,
124s         body: _TYPE_BODY | None = None,
124s         headers: typing.Mapping[str, str] | None = None,
124s         retries: Retry | bool | int | None = None,
124s         redirect: bool = True,
124s         assert_same_host: bool = True,
124s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
124s         pool_timeout: int | None = None,
124s         release_conn: bool | None = None,
124s         chunked: bool = False,
124s         body_pos: _TYPE_BODY_POSITION | None = None,
124s         preload_content: bool = True,
124s         decode_content: bool = True,
124s         **response_kw: typing.Any,
124s     ) -> BaseHTTPResponse:
124s         """
124s         Get a connection from the pool and perform an HTTP request. This is the
124s         lowest level call for making a request, so you'll need to specify all
124s         the raw details.
124s 
124s         .. note::
124s 
124s            More commonly, it's appropriate to use a convenience method
124s            such as :meth:`request`.
124s 
124s         .. note::
124s 
124s            `release_conn` will only behave as expected if
124s            `preload_content=False` because we want to make
124s            `preload_content=False` the default behaviour someday soon without
124s            breaking backwards compatibility.
124s 
124s         :param method:
124s             HTTP request method (such as GET, POST, PUT, etc.)
124s 
124s         :param url:
124s             The URL to perform the request on.
124s 
124s         :param body:
124s             Data to send in the request body, either :class:`str`, :class:`bytes`,
124s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
124s 
124s         :param headers:
124s             Dictionary of custom headers to send, such as User-Agent,
124s             If-None-Match, etc. If None, pool headers are used. If provided,
124s             these headers completely replace any pool-specific headers.
124s 
124s         :param retries:
124s             Configure the number of retries to allow before raising a
124s             :class:`~urllib3.exceptions.MaxRetryError` exception.
124s 
124s             Pass ``None`` to retry until you receive a response. Pass a
124s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
124s             over different types of retries.
124s             Pass an integer number to retry connection errors that many times,
124s             but no other types of errors. Pass zero to never retry.
124s 
124s             If ``False``, then retries are disabled and any exception is raised
124s             immediately. Also, instead of raising a MaxRetryError on redirects,
124s             the redirect response will be returned.
124s 
124s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
124s 
124s         :param redirect:
124s             If True, automatically handle redirects (status codes 301, 302,
124s             303, 307, 308). Each redirect counts as a retry. Disabling retries
124s             will disable redirect, too.
124s 
124s         :param assert_same_host:
124s             If ``True``, will make sure that the host of the pool requests is
124s             consistent else will raise HostChangedError. When ``False``, you can
124s             use the pool on an HTTP proxy and request foreign hosts.
124s 
124s         :param timeout:
124s             If specified, overrides the default timeout for this one
124s             request. It may be a float (in seconds) or an instance of
124s             :class:`urllib3.util.Timeout`.
124s 
124s         :param pool_timeout:
124s             If set and the pool is set to block=True, then this method will
124s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
124s             connection is available within the time period.
124s 
124s         :param bool preload_content:
124s             If True, the response's body will be preloaded into memory.
124s 
124s         :param bool decode_content:
124s             If True, will attempt to decode the body based on the
124s             'content-encoding' header.
124s 
124s         :param release_conn:
124s             If False, then the urlopen call will not release the connection
124s             back into the pool once a response is received (but will release if
124s             you read the entire contents of the response such as when
124s             `preload_content=True`). This is useful if you're not preloading
124s             the response's content immediately. You will need to call
124s             ``r.release_conn()`` on the response ``r`` to return the connection
124s             back into the pool. If None, it takes the value of ``preload_content``
124s             which defaults to ``True``.
124s 
124s         :param bool chunked:
124s             If True, urllib3 will send the body using chunked transfer
124s             encoding. Otherwise, urllib3 will send the body using the standard
124s             content-length form. Defaults to False.
124s 
124s         :param int body_pos:
124s             Position to seek to in file-like body in the event of a retry or
124s             redirect. Typically this won't need to be set because urllib3 will
124s             auto-populate the value when needed.
124s         """
124s         parsed_url = parse_url(url)
124s         destination_scheme = parsed_url.scheme
124s 
124s         if headers is None:
124s             headers = self.headers
124s 
124s         if not isinstance(retries, Retry):
124s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
124s 
124s         if release_conn is None:
124s             release_conn = preload_content
124s 
124s         # Check host
124s         if assert_same_host and not self.is_same_host(url):
124s             raise HostChangedError(self, url, retries)
124s 
124s         # Ensure that the URL we're connecting to is properly encoded
124s         if url.startswith("/"):
124s             url = to_str(_encode_target(url))
124s         else:
124s             url = to_str(parsed_url.url)
124s 
124s         conn = None
124s 
124s         # Track whether `conn` needs to be released before
124s         # returning/raising/recursing. Update this variable if necessary, and
124s         # leave `release_conn` constant throughout the function. That way, if
124s         # the function recurses, the original value of `release_conn` will be
124s         # passed down into the recursive call, and its value will be respected.
124s         #
124s         # See issue #651 [1] for details.
124s         #
124s         # [1]
124s         release_this_conn = release_conn
124s 
124s         http_tunnel_required = connection_requires_http_tunnel(
124s             self.proxy, self.proxy_config, destination_scheme
124s         )
124s 
124s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
124s         # have to copy the headers dict so we can safely change it without those
124s         # changes being reflected in anyone else's copy.
124s         if not http_tunnel_required:
124s             headers = headers.copy() # type: ignore[attr-defined]
124s             headers.update(self.proxy_headers) # type: ignore[union-attr]
124s 
124s         # Must keep the exception bound to a separate variable or else Python 3
124s         # complains about UnboundLocalError.
124s         err = None
124s 
124s         # Keep track of whether we cleanly exited the except block. This
124s         # ensures we do proper cleanup in finally.
124s         clean_exit = False
124s 
124s         # Rewind body position, if needed. Record current position
124s         # for future rewinds in the event of a redirect/retry.
124s         body_pos = set_file_position(body, body_pos)
124s 
124s         try:
124s             # Request a connection from the queue.
124s             timeout_obj = self._get_timeout(timeout)
124s             conn = self._get_conn(timeout=pool_timeout)
124s 
124s             conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
124s 
124s             # Is this a closed/new connection that requires CONNECT tunnelling?
124s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
124s                 try:
124s                     self._prepare_proxy(conn)
124s                 except (BaseSSLError, OSError, SocketTimeout) as e:
124s                     self._raise_timeout(
124s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
124s                     )
124s                     raise
124s 
124s             # If we're going to release the connection in ``finally:``, then
124s             # the response doesn't need to know about the connection. Otherwise
124s             # it will also try to release it and we'll have a double-release
124s             # mess.
124s             response_conn = conn if not release_conn else None
124s 
124s             # Make the request on the HTTPConnection object
124s >           response = self._make_request(
124s                 conn,
124s                 method,
124s                 url,
124s                 timeout=timeout_obj,
124s                 body=body,
124s                 headers=headers,
124s                 chunked=chunked,
124s                 retries=retries,
124s                 response_conn=response_conn,
124s                 preload_content=preload_content,
124s                 decode_content=decode_content,
124s                 **response_kw,
124s             )
124s 
124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
124s     conn.request(
124s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
124s     self.endheaders()
124s /usr/lib/python3.12/http/client.py:1331: in endheaders
124s     self._send_output(message_body, encode_chunked=encode_chunked)
124s /usr/lib/python3.12/http/client.py:1091: in _send_output
124s     self.send(msg)
124s /usr/lib/python3.12/http/client.py:1035: in send
124s     self.connect()
124s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
124s     self.sock = self._new_conn()
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
124s 
124s self =
124s 
124s     def _new_conn(self) -> socket.socket:
124s         """Establish a socket connection and set nodelay settings on it.
124s 
124s         :return: New socket connection.
124s         """
124s         try:
124s             sock = connection.create_connection(
124s                 (self._dns_host, self.port),
124s                 self.timeout,
124s                 source_address=self.source_address,
124s                 socket_options=self.socket_options,
124s             )
124s         except socket.gaierror as e:
124s             raise NameResolutionError(self.host, self, e) from e
124s         except SocketTimeout as e:
124s             raise ConnectTimeoutError(
124s                 self,
124s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
124s             ) from e
124s 
124s         except OSError as e:
124s >           raise NewConnectionError(
124s                 self, f"Failed to establish a new connection: {e}"
124s             ) from e
124s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
124s 
124s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
124s 
124s The above exception was the direct cause of the following exception:
124s 
124s self =
124s request = , stream = False
124s timeout = Timeout(connect=None, read=None, total=None), verify = True
124s cert = None, proxies = OrderedDict()
124s 
124s     def send(
124s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
124s     ):
124s         """Sends PreparedRequest object. Returns Response object.
124s 
124s         :param request: The :class:`PreparedRequest ` being sent.
124s         :param stream: (optional) Whether to stream the request content.
124s         :param timeout: (optional) How long to wait for the server to send
124s             data before giving up, as a float, or a :ref:`(connect timeout,
124s             read timeout) ` tuple.
124s         :type timeout: float or tuple or urllib3 Timeout object
124s         :param verify: (optional) Either a boolean, in which case it controls whether
124s             we verify the server's TLS certificate, or a string, in which case it
124s             must be a path to a CA bundle to use
124s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
124s         :param proxies: (optional) The proxies dictionary to apply to the request.
124s         :rtype: requests.Response
124s         """
124s 
124s         try:
124s             conn = self.get_connection(request.url, proxies)
124s         except LocationValueError as e:
124s             raise InvalidURL(e, request=request)
124s 
124s         self.cert_verify(conn, request.url, verify, cert)
124s         url = self.request_url(request, proxies)
124s         self.add_headers(
124s             request,
124s             stream=stream,
124s             timeout=timeout,
124s             verify=verify,
124s             cert=cert,
124s             proxies=proxies,
124s         )
124s 
124s         chunked = not (request.body is None or "Content-Length" in request.headers)
124s 
124s         if isinstance(timeout, tuple):
124s             try:
124s                 connect, read = timeout
124s                 timeout = TimeoutSauce(connect=connect, read=read)
124s             except ValueError:
124s                 raise ValueError(
124s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
124s                     f"or a single float to set both timeouts to the same value."
124s                 )
124s         elif isinstance(timeout, TimeoutSauce):
124s             pass
124s         else:
124s             timeout = TimeoutSauce(connect=timeout, read=timeout)
124s 
124s         try:
124s >           resp = conn.urlopen(
124s                 method=request.method,
124s                 url=url,
124s                 body=request.body,
124s                 headers=request.headers,
124s                 redirect=False,
124s                 assert_same_host=False,
124s                 preload_content=False,
124s                 decode_content=False,
124s                 retries=self.max_retries,
124s                 timeout=timeout,
124s                 chunked=chunked,
124s             )
124s 
124s /usr/lib/python3/dist-packages/requests/adapters.py:486:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen
124s     retries = retries.increment(
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
124s 
124s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
124s method = 'GET', url = '/a%40b/api/contents', response = None
124s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')
124s _pool =
124s _stacktrace =
124s 
124s     def increment(
124s         self,
124s         method: str | None = None,
124s         url: str | None = None,
124s         response: BaseHTTPResponse | None = None,
124s         error: Exception | None = None,
124s         _pool: ConnectionPool | None = None,
124s         _stacktrace: TracebackType | None = None,
124s     ) -> Retry:
124s         """Return a new Retry object with incremented retry counters.
124s 
124s         :param response: A response object, or None, if the server did not
124s             return a response.
124s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
124s         :param Exception error: An error encountered during the request, or
124s             None if the response was received successfully.
124s 
124s         :return: A new ``Retry`` object.
124s         """
124s         if self.total is False and error:
124s             # Disabled, indicate to re-raise the error.
124s             raise reraise(type(error), error, _stacktrace)
124s 
124s         total = self.total
124s         if total is not None:
124s             total -= 1
124s 
124s         connect = self.connect
124s         read = self.read
124s         redirect = self.redirect
124s         status_count = self.status
124s         other = self.other
124s         cause = "unknown"
124s         status = None
124s         redirect_location = None
124s 
124s         if error and self._is_connection_error(error):
124s             # Connect retry?
124s             if connect is False:
124s                 raise reraise(type(error), error, _stacktrace)
124s             elif connect is not None:
124s                 connect -= 1
124s 
124s         elif error and self._is_read_error(error):
124s             # Read retry?
124s             if read is False or method is None or not self._is_method_retryable(method):
124s                 raise reraise(type(error), error, _stacktrace)
124s             elif read is not None:
124s                 read -= 1
124s 
124s         elif error:
124s             # Other retry?
124s             if other is not None:
124s                 other -= 1
124s 
124s         elif response and response.get_redirect_location():
124s             # Redirect retry?
124s             if redirect is not None:
124s                 redirect -= 1
124s             cause = "too many redirects"
124s             response_redirect_location = response.get_redirect_location()
124s             if response_redirect_location:
124s                 redirect_location = response_redirect_location
124s             status = response.status
124s 
124s         else:
124s             # Incrementing because of a server error like a 500 in
124s             # status_forcelist and the given method is in the allowed_methods
124s             cause = ResponseError.GENERIC_ERROR
124s             if response and response.status:
124s                 if status_count is not None:
124s                     status_count -= 1
124s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
124s                 status = response.status
124s 
124s         history = self.history + (
124s             RequestHistory(method, url, error, status, redirect_location),
124s         )
124s 
124s         new_retry = self.new(
124s             total=total,
124s             connect=connect,
124s             read=read,
124s             redirect=redirect,
124s             status=status_count,
124s             other=other,
124s             history=history,
124s         )
124s 
124s         if new_retry.is_exhausted():
124s             reason = error or ResponseError(cause)
124s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
124s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
124s 
124s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s cls =
124s 
124s     @classmethod
124s     def wait_until_alive(cls):
124s         """Wait for the server to be alive"""
124s         url = cls.base_url() + 'api/contents'
124s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
124s             try:
124s >               cls.fetch_url(url)
124s 
124s notebook/tests/launchnotebook.py:53:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
124s notebook/tests/launchnotebook.py:82: in fetch_url
124s     return requests.get(url)
124s /usr/lib/python3/dist-packages/requests/api.py:73: in get
124s     return request("get", url, params=params, **kwargs)
124s /usr/lib/python3/dist-packages/requests/api.py:59: in request
124s     return session.request(method=method, url=url, **kwargs)
124s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
124s     resp = self.send(prep, **send_kwargs)
124s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
124s     r = adapter.send(request, **kwargs)
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
124s 
124s self =
124s request = , stream = False
124s timeout = Timeout(connect=None, read=None, total=None), verify = True
124s cert = None, proxies = OrderedDict()
124s 
124s     def send(
124s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
124s     ):
124s         """Sends PreparedRequest object. Returns Response object.
124s 
124s         :param request: The :class:`PreparedRequest ` being sent.
124s         :param stream: (optional) Whether to stream the request content.
124s         :param timeout: (optional) How long to wait for the server to send
124s             data before giving up, as a float, or a :ref:`(connect timeout,
124s             read timeout) ` tuple.
124s         :type timeout: float or tuple or urllib3 Timeout object
124s         :param verify: (optional) Either a boolean, in which case it controls whether
124s             we verify the server's TLS certificate, or a string, in which case it
124s             must be a path to a CA bundle to use
124s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
124s         :param proxies: (optional) The proxies dictionary to apply to the request.
124s         :rtype: requests.Response
124s         """
124s 
124s         try:
124s             conn = self.get_connection(request.url, proxies)
124s         except LocationValueError as e:
124s             raise InvalidURL(e, request=request)
124s 
124s         self.cert_verify(conn, request.url, verify, cert)
124s         url = self.request_url(request, proxies)
124s         self.add_headers(
124s             request,
124s             stream=stream,
124s             timeout=timeout,
124s             verify=verify,
124s             cert=cert,
124s             proxies=proxies,
124s         )
124s 
124s         chunked = not (request.body is None or "Content-Length" in request.headers)
124s 
124s         if isinstance(timeout, tuple):
124s             try:
124s                 connect, read = timeout
124s                 timeout = TimeoutSauce(connect=connect, read=read)
124s             except ValueError:
124s                 raise ValueError(
124s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
124s                     f"or a single float to set both timeouts to the same value."
124s                 )
124s         elif isinstance(timeout, TimeoutSauce):
124s             pass
124s         else:
124s             timeout = TimeoutSauce(connect=timeout, read=timeout)
124s 
124s         try:
124s             resp = conn.urlopen(
124s                 method=request.method,
124s                 url=url,
124s                 body=request.body,
124s                 headers=request.headers,
124s                 redirect=False,
124s                 assert_same_host=False,
124s                 preload_content=False,
124s                 decode_content=False,
124s                 retries=self.max_retries,
124s                 timeout=timeout,
124s                 chunked=chunked,
124s             )
124s 
124s         except (ProtocolError, OSError) as err:
124s             raise ConnectionError(err, request=request)
124s 
124s         except MaxRetryError as e:
124s             if isinstance(e.reason, ConnectTimeoutError):
124s                 # TODO: Remove this in 3.0.0: see #2811
124s                 if not isinstance(e.reason, NewConnectionError):
124s                     raise ConnectTimeout(e, request=request)
124s 
124s             if isinstance(e.reason, ResponseError):
124s                 raise RetryError(e, request=request)
124s 
124s             if isinstance(e.reason, _ProxyError):
124s                 raise ProxyError(e, request=request)
124s 
124s             if isinstance(e.reason, _SSLError):
124s                 # This branch is for urllib3 v1.22 and later.
124s                 raise SSLError(e, request=request)
124s 
124s >           raise ConnectionError(e, request=request)
124s E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
124s 
124s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
124s 
124s The above exception was the direct cause of the following exception:
124s 
124s cls =
124s 
124s     @classmethod
124s     def setup_class(cls):
124s >       super().setup_class()
124s 
124s notebook/tests/test_notebookapp.py:212:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
124s notebook/tests/launchnotebook.py:198: in setup_class
124s     cls.wait_until_alive()
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
124s 
124s cls =
124s 
124s     @classmethod
124s     def wait_until_alive(cls):
124s         """Wait for the server to be alive"""
124s         url = cls.base_url() + 'api/contents'
124s         for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
124s             try:
124s                 cls.fetch_url(url)
124s             except ModuleNotFoundError as error:
124s                 # Errors that should be immediately thrown back to caller
124s                 raise error
124s             except Exception as e:
124s                 if not cls.notebook_thread.is_alive():
124s >                   raise RuntimeError("The notebook server failed to start") from e
124s E                   RuntimeError: The notebook server failed to start
124s 
124s notebook/tests/launchnotebook.py:59: RuntimeError
124s ____________ ERROR at setup of RedirectTestCase.test_trailing_slash ____________
124s 
124s self =
124s 
124s     def _new_conn(self) -> socket.socket:
124s         """Establish a socket connection and set nodelay settings on it.
124s 
124s         :return: New socket connection.
124s         """
124s         try:
124s >           sock = connection.create_connection(
124s                 (self._dns_host, self.port),
124s                 self.timeout,
124s                 source_address=self.source_address,
124s                 socket_options=self.socket_options,
124s             )
124s 
124s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
124s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
124s     raise err
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
124s 
124s address = ('localhost', 12341), timeout = None, source_address = None
124s socket_options = [(6, 1, 1)]
124s 
124s     def create_connection(
124s         address: tuple[str, int],
124s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
124s         source_address: tuple[str, int] | None = None,
124s         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
124s     ) -> socket.socket:
124s         """Connect to *address* and return the socket object.
124s 
124s         Convenience function. Connect to *address* (a 2-tuple ``(host,
124s         port)``) and return the socket object. Passing the optional
124s         *timeout* parameter will set the timeout on the socket instance
124s         before attempting to connect. If no *timeout* is supplied, the
124s         global default timeout setting returned by :func:`socket.getdefaulttimeout`
124s         is used. If *source_address* is set it must be a tuple of (host, port)
124s         for the socket to bind as a source address before making the connection.
124s         An host of '' or port 0 tells the OS to use the default.
124s         """
124s 
124s         host, port = address
124s         if host.startswith("["):
124s             host = host.strip("[]")
124s         err = None
124s 
124s         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
124s         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
124s         # The original create_connection function always returns all records.
124s         family = allowed_gai_family()
124s 
124s         try:
124s             host.encode("idna")
124s         except UnicodeError:
124s             raise LocationParseError(f"'{host}', label empty or too long") from None
124s 
124s         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
124s             af, socktype, proto, canonname, sa = res
124s             sock = None
124s             try:
124s                 sock = socket.socket(af, socktype, proto)
124s 
124s                 # If provided, set socket level options before connecting.
124s                 _set_socket_options(sock, socket_options)
124s 
124s                 if timeout is not _DEFAULT_TIMEOUT:
124s                     sock.settimeout(timeout)
124s                 if source_address:
124s                     sock.bind(source_address)
124s >               sock.connect(sa)
124s E               ConnectionRefusedError: [Errno 111] Connection refused
124s 
124s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError
124s 
124s The above exception was the direct cause of the following exception:
124s 
124s self =
124s method = 'GET', url = '/a%40b/api/contents', body = None
124s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
124s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
124s redirect = False, assert_same_host = False
124s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
124s release_conn = False, chunked = False, body_pos = None, preload_content = False
124s decode_content = False, response_kw = {}
124s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None)
124s destination_scheme = None, conn = None, release_this_conn = True
124s http_tunnel_required = False, err = None, clean_exit = False
124s 
124s     def urlopen( # type: ignore[override]
124s         self,
124s         method: str,
124s         url: str,
124s         body: _TYPE_BODY | None = None,
124s         headers: typing.Mapping[str, str] | None = None,
124s         retries: Retry | bool | int | None = None,
124s         redirect: bool = True,
124s         assert_same_host: bool = True,
124s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
124s         pool_timeout: int | None = None,
124s         release_conn: bool | None = None,
124s         chunked: bool = False,
124s         body_pos: _TYPE_BODY_POSITION | None = None,
124s         preload_content: bool = True,
124s         decode_content: bool = True,
124s         **response_kw: typing.Any,
124s     ) -> BaseHTTPResponse:
124s         """
124s         Get a connection from the pool and perform an HTTP request. This is the
124s         lowest level call for making a request, so you'll need to specify all
124s         the raw details.
124s 
124s         .. note::
124s 
124s            More commonly, it's appropriate to use a convenience method
124s            such as :meth:`request`.
124s 
124s         .. note::
124s 
124s            `release_conn` will only behave as expected if
124s            `preload_content=False` because we want to make
124s            `preload_content=False` the default behaviour someday soon without
124s            breaking backwards compatibility.
124s 
124s         :param method:
124s             HTTP request method (such as GET, POST, PUT, etc.)
124s 
124s         :param url:
124s             The URL to perform the request on.
124s 
124s         :param body:
124s             Data to send in the request body, either :class:`str`, :class:`bytes`,
124s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
124s 
124s         :param headers:
124s             Dictionary of custom headers to send, such as User-Agent,
124s             If-None-Match, etc. If None, pool headers are used. If provided,
124s             these headers completely replace any pool-specific headers.
124s 
124s         :param retries:
124s             Configure the number of retries to allow before raising a
124s             :class:`~urllib3.exceptions.MaxRetryError` exception.
124s 
124s             Pass ``None`` to retry until you receive a response. Pass a
124s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
124s             over different types of retries.
124s             Pass an integer number to retry connection errors that many times,
124s             but no other types of errors. Pass zero to never retry.
124s 
124s             If ``False``, then retries are disabled and any exception is raised
124s             immediately. Also, instead of raising a MaxRetryError on redirects,
124s             the redirect response will be returned.
124s 
124s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
124s 
124s         :param redirect:
124s             If True, automatically handle redirects (status codes 301, 302,
124s             303, 307, 308). Each redirect counts as a retry. Disabling retries
124s             will disable redirect, too.
124s 
124s         :param assert_same_host:
124s             If ``True``, will make sure that the host of the pool requests is
124s             consistent else will raise HostChangedError. When ``False``, you can
124s             use the pool on an HTTP proxy and request foreign hosts.
124s 
124s         :param timeout:
124s             If specified, overrides the default timeout for this one
124s             request. It may be a float (in seconds) or an instance of
124s             :class:`urllib3.util.Timeout`.
124s 
124s         :param pool_timeout:
124s             If set and the pool is set to block=True, then this method will
124s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
124s             connection is available within the time period.
124s 
124s         :param bool preload_content:
124s             If True, the response's body will be preloaded into memory.
124s 
124s         :param bool decode_content:
124s             If True, will attempt to decode the body based on the
124s             'content-encoding' header.
124s 
124s         :param release_conn:
124s             If False, then the urlopen call will not release the connection
124s             back into the pool once a response is received (but will release if
124s             you read the entire contents of the response such as when
124s             `preload_content=True`). This is useful if you're not preloading
124s             the response's content immediately. You will need to call
124s             ``r.release_conn()`` on the response ``r`` to return the connection
124s             back into the pool. If None, it takes the value of ``preload_content``
124s             which defaults to ``True``.
124s 
124s         :param bool chunked:
124s             If True, urllib3 will send the body using chunked transfer
124s             encoding. Otherwise, urllib3 will send the body using the standard
124s             content-length form. Defaults to False.
124s 
124s         :param int body_pos:
124s             Position to seek to in file-like body in the event of a retry or
124s             redirect. Typically this won't need to be set because urllib3 will
124s             auto-populate the value when needed.
124s         """
124s         parsed_url = parse_url(url)
124s         destination_scheme = parsed_url.scheme
124s 
124s         if headers is None:
124s             headers = self.headers
124s 
124s         if not isinstance(retries, Retry):
124s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
124s 
124s         if release_conn is None:
124s             release_conn = preload_content
124s 
124s         # Check host
124s         if assert_same_host and not self.is_same_host(url):
124s             raise HostChangedError(self, url, retries)
124s 
124s         # Ensure that the URL we're connecting to is properly encoded
124s         if url.startswith("/"):
124s             url = to_str(_encode_target(url))
124s         else:
124s             url = to_str(parsed_url.url)
124s 
124s         conn = None
124s 
124s         # Track whether `conn` needs to be released before
124s         # returning/raising/recursing. Update this variable if necessary, and
124s         # leave `release_conn` constant throughout the function. That way, if
124s         # the function recurses, the original value of `release_conn` will be
124s         # passed down into the recursive call, and its value will be respected.
124s         #
124s         # See issue #651 [1] for details.
124s         #
124s         # [1]
124s         release_this_conn = release_conn
124s 
124s         http_tunnel_required = connection_requires_http_tunnel(
124s             self.proxy, self.proxy_config, destination_scheme
124s         )
124s 
124s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
124s         # have to copy the headers dict so we can safely change it without those
124s         # changes being reflected in anyone else's copy.
124s         if not http_tunnel_required:
124s             headers = headers.copy() # type: ignore[attr-defined]
124s             headers.update(self.proxy_headers) # type: ignore[union-attr]
124s 
124s         # Must keep the exception bound to a separate variable or else Python 3
124s         # complains about UnboundLocalError.
124s         err = None
124s 
124s         # Keep track of whether we cleanly exited the except block. This
124s         # ensures we do proper cleanup in finally.
124s         clean_exit = False
124s 
124s         # Rewind body position, if needed. Record current position
124s         # for future rewinds in the event of a redirect/retry.
124s         body_pos = set_file_position(body, body_pos)
124s 
124s         try:
124s             # Request a connection from the queue.
124s             timeout_obj = self._get_timeout(timeout)
124s             conn = self._get_conn(timeout=pool_timeout)
124s 
124s             conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
124s 
124s             # Is this a closed/new connection that requires CONNECT tunnelling?
124s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
124s                 try:
124s                     self._prepare_proxy(conn)
124s                 except (BaseSSLError, OSError, SocketTimeout) as e:
124s                     self._raise_timeout(
124s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
124s                     )
124s                     raise
124s 
124s             # If we're going to release the connection in ``finally:``, then
124s             # the response doesn't need to know about the connection. Otherwise
124s             # it will also try to release it and we'll have a double-release
124s             # mess.
124s             response_conn = conn if not release_conn else None
124s 
124s             # Make the request on the HTTPConnection object
124s >           response = self._make_request(
124s                 conn,
124s                 method,
124s                 url,
124s                 timeout=timeout_obj,
124s                 body=body,
124s                 headers=headers,
124s                 chunked=chunked,
124s                 retries=retries,
124s                 response_conn=response_conn,
124s                 preload_content=preload_content,
124s                 decode_content=decode_content,
124s                 **response_kw,
124s             )
124s 
124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request
124s     conn.request(
124s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request
124s     self.endheaders()
124s /usr/lib/python3.12/http/client.py:1331: in endheaders
124s     self._send_output(message_body, encode_chunked=encode_chunked)
124s /usr/lib/python3.12/http/client.py:1091: in _send_output
124s     self.send(msg)
124s /usr/lib/python3.12/http/client.py:1035: in send
124s     self.connect()
124s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect
124s     self.sock = self._new_conn()
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
124s 
124s self =
124s 
124s     def _new_conn(self) -> socket.socket:
124s         """Establish a socket connection and set nodelay settings on it.
124s 
124s         :return: New socket connection.
124s         """
124s         try:
124s             sock = connection.create_connection(
124s                 (self._dns_host, self.port),
124s                 self.timeout,
124s                 source_address=self.source_address,
124s                 socket_options=self.socket_options,
124s             )
124s         except socket.gaierror as e:
124s             raise NameResolutionError(self.host, self, e) from e
124s         except SocketTimeout as e:
124s             raise ConnectTimeoutError(
124s                 self,
124s                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
124s             ) from e
124s 
124s         except OSError as e:
124s >           raise NewConnectionError(
124s                 self, f"Failed to establish a new connection: {e}"
124s             ) from e
124s E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused
124s 
124s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError
124s 
124s The above exception was the direct cause of the following exception:
124s 
124s self =
124s request = , stream = False
124s timeout = Timeout(connect=None, read=None, total=None), verify = True
124s cert = None, proxies = OrderedDict()
124s 
124s     def send(
124s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
124s     ):
124s         """Sends PreparedRequest object. Returns Response object.
124s 
124s         :param request: The :class:`PreparedRequest ` being sent.
124s         :param stream: (optional) Whether to stream the request content.
124s         :param timeout: (optional) How long to wait for the server to send
124s             data before giving up, as a float, or a :ref:`(connect timeout,
124s             read timeout) ` tuple.
124s         :type timeout: float or tuple or urllib3 Timeout object
124s         :param verify: (optional) Either a boolean, in which case it controls whether
124s             we verify the server's TLS certificate, or a string, in which case it
124s             must be a path to a CA bundle to use
124s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
124s         :param proxies: (optional) The proxies dictionary to apply to the request.
124s :rtype: requests.Response 124s """ 124s 124s try: 124s conn = self.get_connection(request.url, proxies) 124s except LocationValueError as e: 124s raise InvalidURL(e, request=request) 124s 124s self.cert_verify(conn, request.url, verify, cert) 124s url = self.request_url(request, proxies) 124s self.add_headers( 124s request, 124s stream=stream, 124s timeout=timeout, 124s verify=verify, 124s cert=cert, 124s proxies=proxies, 124s ) 124s 124s chunked = not (request.body is None or "Content-Length" in request.headers) 124s 124s if isinstance(timeout, tuple): 124s try: 124s connect, read = timeout 124s timeout = TimeoutSauce(connect=connect, read=read) 124s except ValueError: 124s raise ValueError( 124s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 124s f"or a single float to set both timeouts to the same value." 124s ) 124s elif isinstance(timeout, TimeoutSauce): 124s pass 124s else: 124s timeout = TimeoutSauce(connect=timeout, read=timeout) 124s 124s try: 124s > resp = conn.urlopen( 124s method=request.method, 124s url=url, 124s body=request.body, 124s headers=request.headers, 124s redirect=False, 124s assert_same_host=False, 124s preload_content=False, 124s decode_content=False, 124s retries=self.max_retries, 124s timeout=timeout, 124s chunked=chunked, 124s ) 124s 124s /usr/lib/python3/dist-packages/requests/adapters.py:486: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 124s retries = retries.increment( 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 124s method = 'GET', url = '/a%40b/api/contents', response = None 124s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 124s _pool = 124s _stacktrace = 124s 124s def increment( 124s self, 124s method: str | 
None = None, 124s url: str | None = None, 124s response: BaseHTTPResponse | None = None, 124s error: Exception | None = None, 124s _pool: ConnectionPool | None = None, 124s _stacktrace: TracebackType | None = None, 124s ) -> Retry: 124s """Return a new Retry object with incremented retry counters. 124s 124s :param response: A response object, or None, if the server did not 124s return a response. 124s :type response: :class:`~urllib3.response.BaseHTTPResponse` 124s :param Exception error: An error encountered during the request, or 124s None if the response was received successfully. 124s 124s :return: A new ``Retry`` object. 124s """ 124s if self.total is False and error: 124s # Disabled, indicate to re-raise the error. 124s raise reraise(type(error), error, _stacktrace) 124s 124s total = self.total 124s if total is not None: 124s total -= 1 124s 124s connect = self.connect 124s read = self.read 124s redirect = self.redirect 124s status_count = self.status 124s other = self.other 124s cause = "unknown" 124s status = None 124s redirect_location = None 124s 124s if error and self._is_connection_error(error): 124s # Connect retry? 124s if connect is False: 124s raise reraise(type(error), error, _stacktrace) 124s elif connect is not None: 124s connect -= 1 124s 124s elif error and self._is_read_error(error): 124s # Read retry? 124s if read is False or method is None or not self._is_method_retryable(method): 124s raise reraise(type(error), error, _stacktrace) 124s elif read is not None: 124s read -= 1 124s 124s elif error: 124s # Other retry? 124s if other is not None: 124s other -= 1 124s 124s elif response and response.get_redirect_location(): 124s # Redirect retry? 
124s if redirect is not None: 124s redirect -= 1 124s cause = "too many redirects" 124s response_redirect_location = response.get_redirect_location() 124s if response_redirect_location: 124s redirect_location = response_redirect_location 124s status = response.status 124s 124s else: 124s # Incrementing because of a server error like a 500 in 124s # status_forcelist and the given method is in the allowed_methods 124s cause = ResponseError.GENERIC_ERROR 124s if response and response.status: 124s if status_count is not None: 124s status_count -= 1 124s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 124s status = response.status 124s 124s history = self.history + ( 124s RequestHistory(method, url, error, status, redirect_location), 124s ) 124s 124s new_retry = self.new( 124s total=total, 124s connect=connect, 124s read=read, 124s redirect=redirect, 124s status=status_count, 124s other=other, 124s history=history, 124s ) 124s 124s if new_retry.is_exhausted(): 124s reason = error or ResponseError(cause) 124s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 124s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 124s 124s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 124s 124s During handling of the above exception, another exception occurred: 124s 124s cls = 124s 124s @classmethod 124s def wait_until_alive(cls): 124s """Wait for the server to be alive""" 124s url = cls.base_url() + 'api/contents' 124s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 124s try: 124s > cls.fetch_url(url) 124s 124s notebook/tests/launchnotebook.py:53: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s notebook/tests/launchnotebook.py:82: in fetch_url 124s return requests.get(url) 124s 
/usr/lib/python3/dist-packages/requests/api.py:73: in get 124s return request("get", url, params=params, **kwargs) 124s /usr/lib/python3/dist-packages/requests/api.py:59: in request 124s return session.request(method=method, url=url, **kwargs) 124s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 124s resp = self.send(prep, **send_kwargs) 124s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 124s r = adapter.send(request, **kwargs) 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s request = , stream = False 124s timeout = Timeout(connect=None, read=None, total=None), verify = True 124s cert = None, proxies = OrderedDict() 124s 124s def send( 124s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 124s ): 124s """Sends PreparedRequest object. Returns Response object. 124s 124s :param request: The :class:`PreparedRequest ` being sent. 124s :param stream: (optional) Whether to stream the request content. 124s :param timeout: (optional) How long to wait for the server to send 124s data before giving up, as a float, or a :ref:`(connect timeout, 124s read timeout) ` tuple. 124s :type timeout: float or tuple or urllib3 Timeout object 124s :param verify: (optional) Either a boolean, in which case it controls whether 124s we verify the server's TLS certificate, or a string, in which case it 124s must be a path to a CA bundle to use 124s :param cert: (optional) Any user-provided SSL certificate to be trusted. 124s :param proxies: (optional) The proxies dictionary to apply to the request. 
124s :rtype: requests.Response 124s """ 124s 124s try: 124s conn = self.get_connection(request.url, proxies) 124s except LocationValueError as e: 124s raise InvalidURL(e, request=request) 124s 124s self.cert_verify(conn, request.url, verify, cert) 124s url = self.request_url(request, proxies) 124s self.add_headers( 124s request, 124s stream=stream, 124s timeout=timeout, 124s verify=verify, 124s cert=cert, 124s proxies=proxies, 124s ) 124s 124s chunked = not (request.body is None or "Content-Length" in request.headers) 124s 124s if isinstance(timeout, tuple): 124s try: 124s connect, read = timeout 124s timeout = TimeoutSauce(connect=connect, read=read) 124s except ValueError: 124s raise ValueError( 124s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 124s f"or a single float to set both timeouts to the same value." 124s ) 124s elif isinstance(timeout, TimeoutSauce): 124s pass 124s else: 124s timeout = TimeoutSauce(connect=timeout, read=timeout) 124s 124s try: 124s resp = conn.urlopen( 124s method=request.method, 124s url=url, 124s body=request.body, 124s headers=request.headers, 124s redirect=False, 124s assert_same_host=False, 124s preload_content=False, 124s decode_content=False, 124s retries=self.max_retries, 124s timeout=timeout, 124s chunked=chunked, 124s ) 124s 124s except (ProtocolError, OSError) as err: 124s raise ConnectionError(err, request=request) 124s 124s except MaxRetryError as e: 124s if isinstance(e.reason, ConnectTimeoutError): 124s # TODO: Remove this in 3.0.0: see #2811 124s if not isinstance(e.reason, NewConnectionError): 124s raise ConnectTimeout(e, request=request) 124s 124s if isinstance(e.reason, ResponseError): 124s raise RetryError(e, request=request) 124s 124s if isinstance(e.reason, _ProxyError): 124s raise ProxyError(e, request=request) 124s 124s if isinstance(e.reason, _SSLError): 124s # This branch is for urllib3 v1.22 and later. 
124s raise SSLError(e, request=request) 124s 124s > raise ConnectionError(e, request=request) 124s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 124s 124s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 124s 124s The above exception was the direct cause of the following exception: 124s 124s cls = 124s 124s @classmethod 124s def setup_class(cls): 124s cls.tmp_dir = TemporaryDirectory() 124s def tmp(*parts): 124s path = os.path.join(cls.tmp_dir.name, *parts) 124s try: 124s os.makedirs(path) 124s except OSError as e: 124s if e.errno != errno.EEXIST: 124s raise 124s return path 124s 124s cls.home_dir = tmp('home') 124s data_dir = cls.data_dir = tmp('data') 124s config_dir = cls.config_dir = tmp('config') 124s runtime_dir = cls.runtime_dir = tmp('runtime') 124s cls.notebook_dir = tmp('notebooks') 124s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 124s cls.env_patch.start() 124s # Patch systemwide & user-wide data & config directories, to isolate 124s # the tests from oddities of the local setup. But leave Python env 124s # locations alone, so data files for e.g. nbconvert are accessible. 124s # If this isolation isn't sufficient, you may need to run the tests in 124s # a virtualenv or conda env. 
124s cls.path_patch = patch.multiple( 124s jupyter_core.paths, 124s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 124s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 124s ) 124s cls.path_patch.start() 124s 124s config = cls.config or Config() 124s config.NotebookNotary.db_file = ':memory:' 124s 124s cls.token = hexlify(os.urandom(4)).decode('ascii') 124s 124s started = Event() 124s def start_thread(): 124s try: 124s bind_args = cls.get_bind_args() 124s app = cls.notebook = NotebookApp( 124s port_retries=0, 124s open_browser=False, 124s config_dir=cls.config_dir, 124s data_dir=cls.data_dir, 124s runtime_dir=cls.runtime_dir, 124s notebook_dir=cls.notebook_dir, 124s base_url=cls.url_prefix, 124s config=config, 124s allow_root=True, 124s token=cls.token, 124s **bind_args 124s ) 124s if "asyncio" in sys.modules: 124s app._init_asyncio_patch() 124s import asyncio 124s 124s asyncio.set_event_loop(asyncio.new_event_loop()) 124s # Patch the current loop in order to match production 124s # behavior 124s import nest_asyncio 124s 124s nest_asyncio.apply() 124s # don't register signal handler during tests 124s app.init_signal = lambda : None 124s # clear log handlers and propagate to root for nose to capture it 124s # needs to be redone after initialize, which reconfigures logging 124s app.log.propagate = True 124s app.log.handlers = [] 124s app.initialize(argv=cls.get_argv()) 124s app.log.propagate = True 124s app.log.handlers = [] 124s loop = IOLoop.current() 124s loop.add_callback(started.set) 124s app.start() 124s finally: 124s # set the event, so failure to start doesn't cause a hang 124s started.set() 124s app.session_manager.close() 124s cls.notebook_thread = Thread(target=start_thread) 124s cls.notebook_thread.daemon = True 124s cls.notebook_thread.start() 124s started.wait() 124s > cls.wait_until_alive() 124s 124s notebook/tests/launchnotebook.py:198: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s cls = 124s 124s 
@classmethod 124s def wait_until_alive(cls): 124s """Wait for the server to be alive""" 124s url = cls.base_url() + 'api/contents' 124s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 124s try: 124s cls.fetch_url(url) 124s except ModuleNotFoundError as error: 124s # Errors that should be immediately thrown back to caller 124s raise error 124s except Exception as e: 124s if not cls.notebook_thread.is_alive(): 124s > raise RuntimeError("The notebook server failed to start") from e 124s E RuntimeError: The notebook server failed to start 124s 124s notebook/tests/launchnotebook.py:59: RuntimeError 124s ___________________ ERROR at setup of TreeTest.test_redirect ___________________ 124s 124s self = 124s 124s def _new_conn(self) -> socket.socket: 124s """Establish a socket connection and set nodelay settings on it. 124s 124s :return: New socket connection. 124s """ 124s try: 124s > sock = connection.create_connection( 124s (self._dns_host, self.port), 124s self.timeout, 124s source_address=self.source_address, 124s socket_options=self.socket_options, 124s ) 124s 124s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 124s raise err 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s address = ('localhost', 12341), timeout = None, source_address = None 124s socket_options = [(6, 1, 1)] 124s 124s def create_connection( 124s address: tuple[str, int], 124s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 124s source_address: tuple[str, int] | None = None, 124s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 124s ) -> socket.socket: 124s """Connect to *address* and return the socket object. 124s 124s Convenience function. Connect to *address* (a 2-tuple ``(host, 124s port)``) and return the socket object. 
Passing the optional 124s *timeout* parameter will set the timeout on the socket instance 124s before attempting to connect. If no *timeout* is supplied, the 124s global default timeout setting returned by :func:`socket.getdefaulttimeout` 124s is used. If *source_address* is set it must be a tuple of (host, port) 124s for the socket to bind as a source address before making the connection. 124s An host of '' or port 0 tells the OS to use the default. 124s """ 124s 124s host, port = address 124s if host.startswith("["): 124s host = host.strip("[]") 124s err = None 124s 124s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 124s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 124s # The original create_connection function always returns all records. 124s family = allowed_gai_family() 124s 124s try: 124s host.encode("idna") 124s except UnicodeError: 124s raise LocationParseError(f"'{host}', label empty or too long") from None 124s 124s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 124s af, socktype, proto, canonname, sa = res 124s sock = None 124s try: 124s sock = socket.socket(af, socktype, proto) 124s 124s # If provided, set socket level options before connecting. 
124s _set_socket_options(sock, socket_options) 124s 124s if timeout is not _DEFAULT_TIMEOUT: 124s sock.settimeout(timeout) 124s if source_address: 124s sock.bind(source_address) 124s > sock.connect(sa) 124s E ConnectionRefusedError: [Errno 111] Connection refused 124s 124s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 124s 124s The above exception was the direct cause of the following exception: 124s 124s self = 124s method = 'GET', url = '/a%40b/api/contents', body = None 124s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 124s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 124s redirect = False, assert_same_host = False 124s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 124s release_conn = False, chunked = False, body_pos = None, preload_content = False 124s decode_content = False, response_kw = {} 124s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 124s destination_scheme = None, conn = None, release_this_conn = True 124s http_tunnel_required = False, err = None, clean_exit = False 124s 124s def urlopen( # type: ignore[override] 124s self, 124s method: str, 124s url: str, 124s body: _TYPE_BODY | None = None, 124s headers: typing.Mapping[str, str] | None = None, 124s retries: Retry | bool | int | None = None, 124s redirect: bool = True, 124s assert_same_host: bool = True, 124s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 124s pool_timeout: int | None = None, 124s release_conn: bool | None = None, 124s chunked: bool = False, 124s body_pos: _TYPE_BODY_POSITION | None = None, 124s preload_content: bool = True, 124s decode_content: bool = True, 124s **response_kw: typing.Any, 124s ) -> BaseHTTPResponse: 124s """ 124s Get a connection from the pool and perform an HTTP request. 
This is the 124s lowest level call for making a request, so you'll need to specify all 124s the raw details. 124s 124s .. note:: 124s 124s More commonly, it's appropriate to use a convenience method 124s such as :meth:`request`. 124s 124s .. note:: 124s 124s `release_conn` will only behave as expected if 124s `preload_content=False` because we want to make 124s `preload_content=False` the default behaviour someday soon without 124s breaking backwards compatibility. 124s 124s :param method: 124s HTTP request method (such as GET, POST, PUT, etc.) 124s 124s :param url: 124s The URL to perform the request on. 124s 124s :param body: 124s Data to send in the request body, either :class:`str`, :class:`bytes`, 124s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 124s 124s :param headers: 124s Dictionary of custom headers to send, such as User-Agent, 124s If-None-Match, etc. If None, pool headers are used. If provided, 124s these headers completely replace any pool-specific headers. 124s 124s :param retries: 124s Configure the number of retries to allow before raising a 124s :class:`~urllib3.exceptions.MaxRetryError` exception. 124s 124s Pass ``None`` to retry until you receive a response. Pass a 124s :class:`~urllib3.util.retry.Retry` object for fine-grained control 124s over different types of retries. 124s Pass an integer number to retry connection errors that many times, 124s but no other types of errors. Pass zero to never retry. 124s 124s If ``False``, then retries are disabled and any exception is raised 124s immediately. Also, instead of raising a MaxRetryError on redirects, 124s the redirect response will be returned. 124s 124s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 124s 124s :param redirect: 124s If True, automatically handle redirects (status codes 301, 302, 124s 303, 307, 308). Each redirect counts as a retry. Disabling retries 124s will disable redirect, too. 
124s 124s :param assert_same_host: 124s If ``True``, will make sure that the host of the pool requests is 124s consistent else will raise HostChangedError. When ``False``, you can 124s use the pool on an HTTP proxy and request foreign hosts. 124s 124s :param timeout: 124s If specified, overrides the default timeout for this one 124s request. It may be a float (in seconds) or an instance of 124s :class:`urllib3.util.Timeout`. 124s 124s :param pool_timeout: 124s If set and the pool is set to block=True, then this method will 124s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 124s connection is available within the time period. 124s 124s :param bool preload_content: 124s If True, the response's body will be preloaded into memory. 124s 124s :param bool decode_content: 124s If True, will attempt to decode the body based on the 124s 'content-encoding' header. 124s 124s :param release_conn: 124s If False, then the urlopen call will not release the connection 124s back into the pool once a response is received (but will release if 124s you read the entire contents of the response such as when 124s `preload_content=True`). This is useful if you're not preloading 124s the response's content immediately. You will need to call 124s ``r.release_conn()`` on the response ``r`` to return the connection 124s back into the pool. If None, it takes the value of ``preload_content`` 124s which defaults to ``True``. 124s 124s :param bool chunked: 124s If True, urllib3 will send the body using chunked transfer 124s encoding. Otherwise, urllib3 will send the body using the standard 124s content-length form. Defaults to False. 124s 124s :param int body_pos: 124s Position to seek to in file-like body in the event of a retry or 124s redirect. Typically this won't need to be set because urllib3 will 124s auto-populate the value when needed. 
124s """ 124s parsed_url = parse_url(url) 124s destination_scheme = parsed_url.scheme 124s 124s if headers is None: 124s headers = self.headers 124s 124s if not isinstance(retries, Retry): 124s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 124s 124s if release_conn is None: 124s release_conn = preload_content 124s 124s # Check host 124s if assert_same_host and not self.is_same_host(url): 124s raise HostChangedError(self, url, retries) 124s 124s # Ensure that the URL we're connecting to is properly encoded 124s if url.startswith("/"): 124s url = to_str(_encode_target(url)) 124s else: 124s url = to_str(parsed_url.url) 124s 124s conn = None 124s 124s # Track whether `conn` needs to be released before 124s # returning/raising/recursing. Update this variable if necessary, and 124s # leave `release_conn` constant throughout the function. That way, if 124s # the function recurses, the original value of `release_conn` will be 124s # passed down into the recursive call, and its value will be respected. 124s # 124s # See issue #651 [1] for details. 124s # 124s # [1] 124s release_this_conn = release_conn 124s 124s http_tunnel_required = connection_requires_http_tunnel( 124s self.proxy, self.proxy_config, destination_scheme 124s ) 124s 124s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 124s # have to copy the headers dict so we can safely change it without those 124s # changes being reflected in anyone else's copy. 124s if not http_tunnel_required: 124s headers = headers.copy() # type: ignore[attr-defined] 124s headers.update(self.proxy_headers) # type: ignore[union-attr] 124s 124s # Must keep the exception bound to a separate variable or else Python 3 124s # complains about UnboundLocalError. 124s err = None 124s 124s # Keep track of whether we cleanly exited the except block. This 124s # ensures we do proper cleanup in finally. 124s clean_exit = False 124s 124s # Rewind body position, if needed. 
Record current position 124s # for future rewinds in the event of a redirect/retry. 124s body_pos = set_file_position(body, body_pos) 124s 124s try: 124s # Request a connection from the queue. 124s timeout_obj = self._get_timeout(timeout) 124s conn = self._get_conn(timeout=pool_timeout) 124s 124s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 124s 124s # Is this a closed/new connection that requires CONNECT tunnelling? 124s if self.proxy is not None and http_tunnel_required and conn.is_closed: 124s try: 124s self._prepare_proxy(conn) 124s except (BaseSSLError, OSError, SocketTimeout) as e: 124s self._raise_timeout( 124s err=e, url=self.proxy.url, timeout_value=conn.timeout 124s ) 124s raise 124s 124s # If we're going to release the connection in ``finally:``, then 124s # the response doesn't need to know about the connection. Otherwise 124s # it will also try to release it and we'll have a double-release 124s # mess. 124s response_conn = conn if not release_conn else None 124s 124s # Make the request on the HTTPConnection object 124s > response = self._make_request( 124s conn, 124s method, 124s url, 124s timeout=timeout_obj, 124s body=body, 124s headers=headers, 124s chunked=chunked, 124s retries=retries, 124s response_conn=response_conn, 124s preload_content=preload_content, 124s decode_content=decode_content, 124s **response_kw, 124s ) 124s 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 124s conn.request( 124s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 124s self.endheaders() 124s /usr/lib/python3.12/http/client.py:1331: in endheaders 124s self._send_output(message_body, encode_chunked=encode_chunked) 124s /usr/lib/python3.12/http/client.py:1091: in _send_output 124s self.send(msg) 124s /usr/lib/python3.12/http/client.py:1035: in 
send 124s self.connect() 124s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 124s self.sock = self._new_conn() 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s 124s def _new_conn(self) -> socket.socket: 124s """Establish a socket connection and set nodelay settings on it. 124s 124s :return: New socket connection. 124s """ 124s try: 124s sock = connection.create_connection( 124s (self._dns_host, self.port), 124s self.timeout, 124s source_address=self.source_address, 124s socket_options=self.socket_options, 124s ) 124s except socket.gaierror as e: 124s raise NameResolutionError(self.host, self, e) from e 124s except SocketTimeout as e: 124s raise ConnectTimeoutError( 124s self, 124s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 124s ) from e 124s 124s except OSError as e: 124s > raise NewConnectionError( 124s self, f"Failed to establish a new connection: {e}" 124s ) from e 124s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 124s 124s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 124s 124s The above exception was the direct cause of the following exception: 124s 124s self = 124s request = , stream = False 124s timeout = Timeout(connect=None, read=None, total=None), verify = True 124s cert = None, proxies = OrderedDict() 124s 124s def send( 124s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 124s ): 124s """Sends PreparedRequest object. Returns Response object. 124s 124s :param request: The :class:`PreparedRequest ` being sent. 124s :param stream: (optional) Whether to stream the request content. 124s :param timeout: (optional) How long to wait for the server to send 124s data before giving up, as a float, or a :ref:`(connect timeout, 124s read timeout) ` tuple. 
124s :type timeout: float or tuple or urllib3 Timeout object 124s :param verify: (optional) Either a boolean, in which case it controls whether 124s we verify the server's TLS certificate, or a string, in which case it 124s must be a path to a CA bundle to use 124s :param cert: (optional) Any user-provided SSL certificate to be trusted. 124s :param proxies: (optional) The proxies dictionary to apply to the request. 124s :rtype: requests.Response 124s """ 124s 124s try: 124s conn = self.get_connection(request.url, proxies) 124s except LocationValueError as e: 124s raise InvalidURL(e, request=request) 124s 124s self.cert_verify(conn, request.url, verify, cert) 124s url = self.request_url(request, proxies) 124s self.add_headers( 124s request, 124s stream=stream, 124s timeout=timeout, 124s verify=verify, 124s cert=cert, 124s proxies=proxies, 124s ) 124s 124s chunked = not (request.body is None or "Content-Length" in request.headers) 124s 124s if isinstance(timeout, tuple): 124s try: 124s connect, read = timeout 124s timeout = TimeoutSauce(connect=connect, read=read) 124s except ValueError: 124s raise ValueError( 124s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 124s f"or a single float to set both timeouts to the same value." 
124s ) 124s elif isinstance(timeout, TimeoutSauce): 124s pass 124s else: 124s timeout = TimeoutSauce(connect=timeout, read=timeout) 124s 124s try: 124s > resp = conn.urlopen( 124s method=request.method, 124s url=url, 124s body=request.body, 124s headers=request.headers, 124s redirect=False, 124s assert_same_host=False, 124s preload_content=False, 124s decode_content=False, 124s retries=self.max_retries, 124s timeout=timeout, 124s chunked=chunked, 124s ) 124s 124s /usr/lib/python3/dist-packages/requests/adapters.py:486: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 124s retries = retries.increment( 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 124s method = 'GET', url = '/a%40b/api/contents', response = None 124s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 124s _pool = 124s _stacktrace = 124s 124s def increment( 124s self, 124s method: str | None = None, 124s url: str | None = None, 124s response: BaseHTTPResponse | None = None, 124s error: Exception | None = None, 124s _pool: ConnectionPool | None = None, 124s _stacktrace: TracebackType | None = None, 124s ) -> Retry: 124s """Return a new Retry object with incremented retry counters. 124s 124s :param response: A response object, or None, if the server did not 124s return a response. 124s :type response: :class:`~urllib3.response.BaseHTTPResponse` 124s :param Exception error: An error encountered during the request, or 124s None if the response was received successfully. 124s 124s :return: A new ``Retry`` object. 124s """ 124s if self.total is False and error: 124s # Disabled, indicate to re-raise the error. 
124s raise reraise(type(error), error, _stacktrace) 124s 124s total = self.total 124s if total is not None: 124s total -= 1 124s 124s connect = self.connect 124s read = self.read 124s redirect = self.redirect 124s status_count = self.status 124s other = self.other 124s cause = "unknown" 124s status = None 124s redirect_location = None 124s 124s if error and self._is_connection_error(error): 124s # Connect retry? 124s if connect is False: 124s raise reraise(type(error), error, _stacktrace) 124s elif connect is not None: 124s connect -= 1 124s 124s elif error and self._is_read_error(error): 124s # Read retry? 124s if read is False or method is None or not self._is_method_retryable(method): 124s raise reraise(type(error), error, _stacktrace) 124s elif read is not None: 124s read -= 1 124s 124s elif error: 124s # Other retry? 124s if other is not None: 124s other -= 1 124s 124s elif response and response.get_redirect_location(): 124s # Redirect retry? 124s if redirect is not None: 124s redirect -= 1 124s cause = "too many redirects" 124s response_redirect_location = response.get_redirect_location() 124s if response_redirect_location: 124s redirect_location = response_redirect_location 124s status = response.status 124s 124s else: 124s # Incrementing because of a server error like a 500 in 124s # status_forcelist and the given method is in the allowed_methods 124s cause = ResponseError.GENERIC_ERROR 124s if response and response.status: 124s if status_count is not None: 124s status_count -= 1 124s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 124s status = response.status 124s 124s history = self.history + ( 124s RequestHistory(method, url, error, status, redirect_location), 124s ) 124s 124s new_retry = self.new( 124s total=total, 124s connect=connect, 124s read=read, 124s redirect=redirect, 124s status=status_count, 124s other=other, 124s history=history, 124s ) 124s 124s if new_retry.is_exhausted(): 124s reason = error or 
ResponseError(cause) 124s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 124s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 124s 124s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 124s 124s During handling of the above exception, another exception occurred: 124s 124s cls = 124s 124s @classmethod 124s def wait_until_alive(cls): 124s """Wait for the server to be alive""" 124s url = cls.base_url() + 'api/contents' 124s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 124s try: 124s > cls.fetch_url(url) 124s 124s notebook/tests/launchnotebook.py:53: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s notebook/tests/launchnotebook.py:82: in fetch_url 124s return requests.get(url) 124s /usr/lib/python3/dist-packages/requests/api.py:73: in get 124s return request("get", url, params=params, **kwargs) 124s /usr/lib/python3/dist-packages/requests/api.py:59: in request 124s return session.request(method=method, url=url, **kwargs) 124s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 124s resp = self.send(prep, **send_kwargs) 124s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 124s r = adapter.send(request, **kwargs) 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s request = , stream = False 124s timeout = Timeout(connect=None, read=None, total=None), verify = True 124s cert = None, proxies = OrderedDict() 124s 124s def send( 124s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 124s ): 124s """Sends PreparedRequest object. Returns Response object. 124s 124s :param request: The :class:`PreparedRequest ` being sent. 
124s :param stream: (optional) Whether to stream the request content. 124s :param timeout: (optional) How long to wait for the server to send 124s data before giving up, as a float, or a :ref:`(connect timeout, 124s read timeout) ` tuple. 124s :type timeout: float or tuple or urllib3 Timeout object 124s :param verify: (optional) Either a boolean, in which case it controls whether 124s we verify the server's TLS certificate, or a string, in which case it 124s must be a path to a CA bundle to use 124s :param cert: (optional) Any user-provided SSL certificate to be trusted. 124s :param proxies: (optional) The proxies dictionary to apply to the request. 124s :rtype: requests.Response 124s """ 124s 124s try: 124s conn = self.get_connection(request.url, proxies) 124s except LocationValueError as e: 124s raise InvalidURL(e, request=request) 124s 124s self.cert_verify(conn, request.url, verify, cert) 124s url = self.request_url(request, proxies) 124s self.add_headers( 124s request, 124s stream=stream, 124s timeout=timeout, 124s verify=verify, 124s cert=cert, 124s proxies=proxies, 124s ) 124s 124s chunked = not (request.body is None or "Content-Length" in request.headers) 124s 124s if isinstance(timeout, tuple): 124s try: 124s connect, read = timeout 124s timeout = TimeoutSauce(connect=connect, read=read) 124s except ValueError: 124s raise ValueError( 124s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 124s f"or a single float to set both timeouts to the same value." 
124s ) 124s elif isinstance(timeout, TimeoutSauce): 124s pass 124s else: 124s timeout = TimeoutSauce(connect=timeout, read=timeout) 124s 124s try: 124s resp = conn.urlopen( 124s method=request.method, 124s url=url, 124s body=request.body, 124s headers=request.headers, 124s redirect=False, 124s assert_same_host=False, 124s preload_content=False, 124s decode_content=False, 124s retries=self.max_retries, 124s timeout=timeout, 124s chunked=chunked, 124s ) 124s 124s except (ProtocolError, OSError) as err: 124s raise ConnectionError(err, request=request) 124s 124s except MaxRetryError as e: 124s if isinstance(e.reason, ConnectTimeoutError): 124s # TODO: Remove this in 3.0.0: see #2811 124s if not isinstance(e.reason, NewConnectionError): 124s raise ConnectTimeout(e, request=request) 124s 124s if isinstance(e.reason, ResponseError): 124s raise RetryError(e, request=request) 124s 124s if isinstance(e.reason, _ProxyError): 124s raise ProxyError(e, request=request) 124s 124s if isinstance(e.reason, _SSLError): 124s # This branch is for urllib3 v1.22 and later. 
124s raise SSLError(e, request=request) 124s 124s > raise ConnectionError(e, request=request) 124s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 124s 124s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 124s 124s The above exception was the direct cause of the following exception: 124s 124s cls = 124s 124s @classmethod 124s def setup_class(cls): 124s cls.tmp_dir = TemporaryDirectory() 124s def tmp(*parts): 124s path = os.path.join(cls.tmp_dir.name, *parts) 124s try: 124s os.makedirs(path) 124s except OSError as e: 124s if e.errno != errno.EEXIST: 124s raise 124s return path 124s 124s cls.home_dir = tmp('home') 124s data_dir = cls.data_dir = tmp('data') 124s config_dir = cls.config_dir = tmp('config') 124s runtime_dir = cls.runtime_dir = tmp('runtime') 124s cls.notebook_dir = tmp('notebooks') 124s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 124s cls.env_patch.start() 124s # Patch systemwide & user-wide data & config directories, to isolate 124s # the tests from oddities of the local setup. But leave Python env 124s # locations alone, so data files for e.g. nbconvert are accessible. 124s # If this isolation isn't sufficient, you may need to run the tests in 124s # a virtualenv or conda env. 
124s cls.path_patch = patch.multiple( 124s jupyter_core.paths, 124s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 124s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 124s ) 124s cls.path_patch.start() 124s 124s config = cls.config or Config() 124s config.NotebookNotary.db_file = ':memory:' 124s 124s cls.token = hexlify(os.urandom(4)).decode('ascii') 124s 124s started = Event() 124s def start_thread(): 124s try: 124s bind_args = cls.get_bind_args() 124s app = cls.notebook = NotebookApp( 124s port_retries=0, 124s open_browser=False, 124s config_dir=cls.config_dir, 124s data_dir=cls.data_dir, 124s runtime_dir=cls.runtime_dir, 124s notebook_dir=cls.notebook_dir, 124s base_url=cls.url_prefix, 124s config=config, 124s allow_root=True, 124s token=cls.token, 124s **bind_args 124s ) 124s if "asyncio" in sys.modules: 124s app._init_asyncio_patch() 124s import asyncio 124s 124s asyncio.set_event_loop(asyncio.new_event_loop()) 124s # Patch the current loop in order to match production 124s # behavior 124s import nest_asyncio 124s 124s nest_asyncio.apply() 124s # don't register signal handler during tests 124s app.init_signal = lambda : None 124s # clear log handlers and propagate to root for nose to capture it 124s # needs to be redone after initialize, which reconfigures logging 124s app.log.propagate = True 124s app.log.handlers = [] 124s app.initialize(argv=cls.get_argv()) 124s app.log.propagate = True 124s app.log.handlers = [] 124s loop = IOLoop.current() 124s loop.add_callback(started.set) 124s app.start() 124s finally: 124s # set the event, so failure to start doesn't cause a hang 124s started.set() 124s app.session_manager.close() 124s cls.notebook_thread = Thread(target=start_thread) 124s cls.notebook_thread.daemon = True 124s cls.notebook_thread.start() 124s started.wait() 124s > cls.wait_until_alive() 124s 124s notebook/tests/launchnotebook.py:198: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s cls = 124s 124s 
@classmethod 124s def wait_until_alive(cls): 124s """Wait for the server to be alive""" 124s url = cls.base_url() + 'api/contents' 124s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 124s try: 124s cls.fetch_url(url) 124s except ModuleNotFoundError as error: 124s # Errors that should be immediately thrown back to caller 124s raise error 124s except Exception as e: 124s if not cls.notebook_thread.is_alive(): 124s > raise RuntimeError("The notebook server failed to start") from e 124s E RuntimeError: The notebook server failed to start 124s 124s notebook/tests/launchnotebook.py:59: RuntimeError 124s =================================== FAILURES =================================== 124s __________________ TestSessionManager.test_bad_delete_session __________________ 124s 124s self = 124s 124s def _resolve_classes(self): 124s # Resolve all string names to actual classes. 124s self.importable_klasses = [] 124s for klass in self.klasses: 124s if isinstance(klass, str): 124s try: 124s > klass = self._resolve_string(klass) 124s 124s notebook/traittypes.py:336: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 124s return import_item(string) 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s name = 'jupyter_server.services.contents.manager.ContentsManager' 124s 124s def import_item(name: str) -> Any: 124s """Import and return ``bar`` given the string ``foo.bar``. 124s 124s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 124s executing the code ``from foo import bar``. 124s 124s Parameters 124s ---------- 124s name : string 124s The fully qualified name of the module/package being imported. 124s 124s Returns 124s ------- 124s mod : module object 124s The module that was imported. 124s """ 124s if not isinstance(name, str): 124s raise TypeError("import_item accepts strings, not '%s'." 
% type(name)) 124s parts = name.rsplit(".", 1) 124s if len(parts) == 2: 124s # called with 'foo.bar....' 124s package, obj = parts 124s > module = __import__(package, fromlist=[obj]) 124s E ModuleNotFoundError: No module named 'jupyter_server' 124s 124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 124s 124s During handling of the above exception, another exception occurred: 124s 124s self = 124s 124s def setUp(self): 124s > self.sm = SessionManager( 124s kernel_manager=DummyMKM(), 124s contents_manager=ContentsManager(), 124s ) 124s 124s notebook/services/sessions/tests/test_sessionmanager.py:45: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 124s inst.setup_instance(*args, **kwargs) 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 124s super(HasTraits, self).setup_instance(*args, **kwargs) 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 124s init(self) 124s notebook/traittypes.py:327: in instance_init 124s self._resolve_classes() 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s 124s def _resolve_classes(self): 124s # Resolve all string names to actual classes. 124s self.importable_klasses = [] 124s for klass in self.klasses: 124s if isinstance(klass, str): 124s try: 124s klass = self._resolve_string(klass) 124s self.importable_klasses.append(klass) 124s except: 124s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 124s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 124s 124s notebook/traittypes.py:339: TypeError 124s ___________________ TestSessionManager.test_bad_get_session ____________________ 124s 124s self = 124s 124s def _resolve_classes(self): 124s # Resolve all string names to actual classes. 
124s self.importable_klasses = [] 124s for klass in self.klasses: 124s if isinstance(klass, str): 124s try: 124s > klass = self._resolve_string(klass) 124s 124s notebook/traittypes.py:336: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 124s return import_item(string) 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s name = 'jupyter_server.services.contents.manager.ContentsManager' 124s 124s def import_item(name: str) -> Any: 124s """Import and return ``bar`` given the string ``foo.bar``. 124s 124s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 124s executing the code ``from foo import bar``. 124s 124s Parameters 124s ---------- 124s name : string 124s The fully qualified name of the module/package being imported. 124s 124s Returns 124s ------- 124s mod : module object 124s The module that was imported. 124s """ 124s if not isinstance(name, str): 124s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 124s parts = name.rsplit(".", 1) 124s if len(parts) == 2: 124s # called with 'foo.bar....' 
124s package, obj = parts 124s > module = __import__(package, fromlist=[obj]) 124s E ModuleNotFoundError: No module named 'jupyter_server' 124s 124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 124s 124s During handling of the above exception, another exception occurred: 124s 124s self = 124s 124s def setUp(self): 124s > self.sm = SessionManager( 124s kernel_manager=DummyMKM(), 124s contents_manager=ContentsManager(), 124s ) 124s 124s notebook/services/sessions/tests/test_sessionmanager.py:45: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 124s inst.setup_instance(*args, **kwargs) 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 124s super(HasTraits, self).setup_instance(*args, **kwargs) 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 124s init(self) 124s notebook/traittypes.py:327: in instance_init 124s self._resolve_classes() 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s 124s def _resolve_classes(self): 124s # Resolve all string names to actual classes. 124s self.importable_klasses = [] 124s for klass in self.klasses: 124s if isinstance(klass, str): 124s try: 124s klass = self._resolve_string(klass) 124s self.importable_klasses.append(klass) 124s except: 124s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 124s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 124s 124s notebook/traittypes.py:339: TypeError 124s __________________ TestSessionManager.test_bad_update_session __________________ 124s 124s self = 124s 124s def _resolve_classes(self): 124s # Resolve all string names to actual classes. 
124s self.importable_klasses = [] 124s for klass in self.klasses: 124s if isinstance(klass, str): 124s try: 124s > klass = self._resolve_string(klass) 124s 124s notebook/traittypes.py:336: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 124s return import_item(string) 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s name = 'jupyter_server.services.contents.manager.ContentsManager' 124s 124s def import_item(name: str) -> Any: 124s """Import and return ``bar`` given the string ``foo.bar``. 124s 124s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 124s executing the code ``from foo import bar``. 124s 124s Parameters 124s ---------- 124s name : string 124s The fully qualified name of the module/package being imported. 124s 124s Returns 124s ------- 124s mod : module object 124s The module that was imported. 124s """ 124s if not isinstance(name, str): 124s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 124s parts = name.rsplit(".", 1) 124s if len(parts) == 2: 124s # called with 'foo.bar....' 
124s package, obj = parts 124s > module = __import__(package, fromlist=[obj]) 124s E ModuleNotFoundError: No module named 'jupyter_server' 124s 124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 124s 124s During handling of the above exception, another exception occurred: 124s 124s self = 124s 124s def setUp(self): 124s > self.sm = SessionManager( 124s kernel_manager=DummyMKM(), 124s contents_manager=ContentsManager(), 124s ) 124s 124s notebook/services/sessions/tests/test_sessionmanager.py:45: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 124s inst.setup_instance(*args, **kwargs) 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 124s super(HasTraits, self).setup_instance(*args, **kwargs) 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 124s init(self) 124s notebook/traittypes.py:327: in instance_init 124s self._resolve_classes() 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s 124s def _resolve_classes(self): 124s # Resolve all string names to actual classes. 124s self.importable_klasses = [] 124s for klass in self.klasses: 124s if isinstance(klass, str): 124s try: 124s klass = self._resolve_string(klass) 124s self.importable_klasses.append(klass) 124s except: 124s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 124s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 124s 124s notebook/traittypes.py:339: TypeError 124s ____________________ TestSessionManager.test_delete_session ____________________ 124s 124s self = 124s 124s def _resolve_classes(self): 124s # Resolve all string names to actual classes. 
124s self.importable_klasses = [] 124s for klass in self.klasses: 124s if isinstance(klass, str): 124s try: 124s > klass = self._resolve_string(klass) 124s 124s notebook/traittypes.py:336: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 124s return import_item(string) 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s name = 'jupyter_server.services.contents.manager.ContentsManager' 124s 124s def import_item(name: str) -> Any: 124s """Import and return ``bar`` given the string ``foo.bar``. 124s 124s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 124s executing the code ``from foo import bar``. 124s 124s Parameters 124s ---------- 124s name : string 124s The fully qualified name of the module/package being imported. 124s 124s Returns 124s ------- 124s mod : module object 124s The module that was imported. 124s """ 124s if not isinstance(name, str): 124s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 124s parts = name.rsplit(".", 1) 124s if len(parts) == 2: 124s # called with 'foo.bar....' 
124s package, obj = parts 124s > module = __import__(package, fromlist=[obj]) 124s E ModuleNotFoundError: No module named 'jupyter_server' 124s 124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 124s 124s During handling of the above exception, another exception occurred: 124s 124s self = 124s 124s def setUp(self): 124s > self.sm = SessionManager( 124s kernel_manager=DummyMKM(), 124s contents_manager=ContentsManager(), 124s ) 124s 124s notebook/services/sessions/tests/test_sessionmanager.py:45: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 124s inst.setup_instance(*args, **kwargs) 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 124s super(HasTraits, self).setup_instance(*args, **kwargs) 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 124s init(self) 124s notebook/traittypes.py:327: in instance_init 124s self._resolve_classes() 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s 124s def _resolve_classes(self): 124s # Resolve all string names to actual classes. 124s self.importable_klasses = [] 124s for klass in self.klasses: 124s if isinstance(klass, str): 124s try: 124s klass = self._resolve_string(klass) 124s self.importable_klasses.append(klass) 124s except: 124s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 124s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 124s 124s notebook/traittypes.py:339: TypeError 124s _____________________ TestSessionManager.test_get_session ______________________ 124s 124s self = 124s 124s def _resolve_classes(self): 124s # Resolve all string names to actual classes. 
124s self.importable_klasses = [] 124s for klass in self.klasses: 124s if isinstance(klass, str): 124s try: 124s > klass = self._resolve_string(klass) 124s 124s notebook/traittypes.py:336: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 124s return import_item(string) 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s name = 'jupyter_server.services.contents.manager.ContentsManager' 124s 124s def import_item(name: str) -> Any: 124s """Import and return ``bar`` given the string ``foo.bar``. 124s 124s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 124s executing the code ``from foo import bar``. 124s 124s Parameters 124s ---------- 124s name : string 124s The fully qualified name of the module/package being imported. 124s 124s Returns 124s ------- 124s mod : module object 124s The module that was imported. 124s """ 124s if not isinstance(name, str): 124s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 124s parts = name.rsplit(".", 1) 124s if len(parts) == 2: 124s # called with 'foo.bar....' 
124s package, obj = parts 124s > module = __import__(package, fromlist=[obj]) 124s E ModuleNotFoundError: No module named 'jupyter_server' 124s 124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 124s 124s During handling of the above exception, another exception occurred: 124s 124s self = 124s 124s def setUp(self): 124s > self.sm = SessionManager( 124s kernel_manager=DummyMKM(), 124s contents_manager=ContentsManager(), 124s ) 124s 124s notebook/services/sessions/tests/test_sessionmanager.py:45: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 124s inst.setup_instance(*args, **kwargs) 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 124s super(HasTraits, self).setup_instance(*args, **kwargs) 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 124s init(self) 124s notebook/traittypes.py:327: in instance_init 124s self._resolve_classes() 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s 124s def _resolve_classes(self): 124s # Resolve all string names to actual classes. 124s self.importable_klasses = [] 124s for klass in self.klasses: 124s if isinstance(klass, str): 124s try: 124s klass = self._resolve_string(klass) 124s self.importable_klasses.append(klass) 124s except: 124s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 124s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 124s 124s notebook/traittypes.py:339: TypeError 124s _______________ TestSessionManager.test_get_session_dead_kernel ________________ 124s 124s self = 124s 124s def _resolve_classes(self): 124s # Resolve all string names to actual classes. 
124s self.importable_klasses = [] 124s for klass in self.klasses: 124s if isinstance(klass, str): 124s try: 124s > klass = self._resolve_string(klass) 124s 124s notebook/traittypes.py:336: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 124s return import_item(string) 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s name = 'jupyter_server.services.contents.manager.ContentsManager' 124s 124s def import_item(name: str) -> Any: 124s """Import and return ``bar`` given the string ``foo.bar``. 124s 124s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 124s executing the code ``from foo import bar``. 124s 124s Parameters 124s ---------- 124s name : string 124s The fully qualified name of the module/package being imported. 124s 124s Returns 124s ------- 124s mod : module object 124s The module that was imported. 124s """ 124s if not isinstance(name, str): 124s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 124s parts = name.rsplit(".", 1) 124s if len(parts) == 2: 124s # called with 'foo.bar....' 
124s package, obj = parts 124s > module = __import__(package, fromlist=[obj]) 124s E ModuleNotFoundError: No module named 'jupyter_server' 124s 124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 124s 124s During handling of the above exception, another exception occurred: 124s 124s self = 124s 124s def setUp(self): 124s > self.sm = SessionManager( 124s kernel_manager=DummyMKM(), 124s contents_manager=ContentsManager(), 124s ) 124s 124s notebook/services/sessions/tests/test_sessionmanager.py:45: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 124s inst.setup_instance(*args, **kwargs) 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 124s super(HasTraits, self).setup_instance(*args, **kwargs) 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 124s init(self) 124s notebook/traittypes.py:327: in instance_init 124s self._resolve_classes() 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s 124s def _resolve_classes(self): 124s # Resolve all string names to actual classes. 124s self.importable_klasses = [] 124s for klass in self.klasses: 124s if isinstance(klass, str): 124s try: 124s klass = self._resolve_string(klass) 124s self.importable_klasses.append(klass) 124s except: 124s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 124s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 124s 124s notebook/traittypes.py:339: TypeError 124s ____________________ TestSessionManager.test_list_sessions _____________________ 124s 124s self = 124s 124s def _resolve_classes(self): 124s # Resolve all string names to actual classes. 
124s self.importable_klasses = [] 124s for klass in self.klasses: 124s if isinstance(klass, str): 124s try: 124s > klass = self._resolve_string(klass) 124s 124s notebook/traittypes.py:336: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 124s return import_item(string) 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s name = 'jupyter_server.services.contents.manager.ContentsManager' 124s 124s def import_item(name: str) -> Any: 124s """Import and return ``bar`` given the string ``foo.bar``. 124s 124s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 124s executing the code ``from foo import bar``. 124s 124s Parameters 124s ---------- 124s name : string 124s The fully qualified name of the module/package being imported. 124s 124s Returns 124s ------- 124s mod : module object 124s The module that was imported. 124s """ 124s if not isinstance(name, str): 124s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 124s parts = name.rsplit(".", 1) 124s if len(parts) == 2: 124s # called with 'foo.bar....' 
124s package, obj = parts 124s > module = __import__(package, fromlist=[obj]) 124s E ModuleNotFoundError: No module named 'jupyter_server' 124s 124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 124s 124s During handling of the above exception, another exception occurred: 124s 124s self = 124s 124s def setUp(self): 124s > self.sm = SessionManager( 124s kernel_manager=DummyMKM(), 124s contents_manager=ContentsManager(), 124s ) 124s 124s notebook/services/sessions/tests/test_sessionmanager.py:45: 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 124s inst.setup_instance(*args, **kwargs) 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 124s super(HasTraits, self).setup_instance(*args, **kwargs) 124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 124s init(self) 124s notebook/traittypes.py:327: in instance_init 124s self._resolve_classes() 124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 124s 124s self = 124s 124s def _resolve_classes(self): 124s # Resolve all string names to actual classes. 124s self.importable_klasses = [] 124s for klass in self.klasses: 124s if isinstance(klass, str): 124s try: 124s klass = self._resolve_string(klass) 124s self.importable_klasses.append(klass) 124s except: 124s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 124s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 124s 124s notebook/traittypes.py:339: TypeError 124s ______________ TestSessionManager.test_list_sessions_dead_kernel _______________ 124s 124s self = 124s 124s def _resolve_classes(self): 124s # Resolve all string names to actual classes. 
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s >                   klass = self._resolve_string(klass)
124s 
124s notebook/traittypes.py:336:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string
124s     return import_item(string)
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s name = 'jupyter_server.services.contents.manager.ContentsManager'
124s 
124s     def import_item(name: str) -> Any:
124s         """Import and return ``bar`` given the string ``foo.bar``.
124s 
124s         Calling ``bar = import_item("foo.bar")`` is the functional equivalent of
124s         executing the code ``from foo import bar``.
124s 
124s         Parameters
124s         ----------
124s         name : string
124s             The fully qualified name of the module/package being imported.
124s 
124s         Returns
124s         -------
124s         mod : module object
124s             The module that was imported.
124s         """
124s         if not isinstance(name, str):
124s             raise TypeError("import_item accepts strings, not '%s'." % type(name))
124s         parts = name.rsplit(".", 1)
124s         if len(parts) == 2:
124s             # called with 'foo.bar....'
124s             package, obj = parts
124s >           module = __import__(package, fromlist=[obj])
124s E           ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s self = 
124s 
124s     def setUp(self):
124s >       self.sm = SessionManager(
124s             kernel_manager=DummyMKM(),
124s             contents_manager=ContentsManager(),
124s         )
124s 
124s notebook/services/sessions/tests/test_sessionmanager.py:45:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__
124s     inst.setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance
124s     init(self)
124s notebook/traittypes.py:327: in instance_init
124s     self._resolve_classes()
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s                     klass = self._resolve_string(klass)
124s                     self.importable_klasses.append(klass)
124s                 except:
124s >                   warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s E                   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s notebook/traittypes.py:339: TypeError
124s ____________________ TestSessionManager.test_update_session ____________________
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s >                   klass = self._resolve_string(klass)
124s 
124s notebook/traittypes.py:336:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string
124s     return import_item(string)
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s name = 'jupyter_server.services.contents.manager.ContentsManager'
124s 
124s     def import_item(name: str) -> Any:
124s         """Import and return ``bar`` given the string ``foo.bar``.
124s 
124s         Calling ``bar = import_item("foo.bar")`` is the functional equivalent of
124s         executing the code ``from foo import bar``.
124s 
124s         Parameters
124s         ----------
124s         name : string
124s             The fully qualified name of the module/package being imported.
124s 
124s         Returns
124s         -------
124s         mod : module object
124s             The module that was imported.
124s         """
124s         if not isinstance(name, str):
124s             raise TypeError("import_item accepts strings, not '%s'." % type(name))
124s         parts = name.rsplit(".", 1)
124s         if len(parts) == 2:
124s             # called with 'foo.bar....'
124s             package, obj = parts
124s >           module = __import__(package, fromlist=[obj])
124s E           ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s self = 
124s 
124s     def setUp(self):
124s >       self.sm = SessionManager(
124s             kernel_manager=DummyMKM(),
124s             contents_manager=ContentsManager(),
124s         )
124s 
124s notebook/services/sessions/tests/test_sessionmanager.py:45:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__
124s     inst.setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance
124s     init(self)
124s notebook/traittypes.py:327: in instance_init
124s     self._resolve_classes()
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s                     klass = self._resolve_string(klass)
124s                     self.importable_klasses.append(klass)
124s                 except:
124s >                   warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s E                   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s notebook/traittypes.py:339: TypeError
124s _______________________________ test_help_output _______________________________
124s 
124s     def test_help_output():
124s         """ipython notebook --help-all works"""
124s >       check_help_all_output('notebook')
124s 
124s notebook/tests/test_notebookapp.py:28:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s pkg = 'notebook', subcommand = None
124s 
124s     def check_help_all_output(pkg: str, subcommand: Sequence[str] | None = None) -> tuple[str, str]:
124s         """test that `python -m PKG --help-all` works"""
124s         cmd = [sys.executable, "-m", pkg]
124s         if subcommand:
124s             cmd.extend(subcommand)
124s         cmd.append("--help-all")
124s         out, err, rc = get_output_error_code(cmd)
124s >       assert rc == 0, err
124s E       AssertionError: Traceback (most recent call last):
124s E         File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s E           klass = self._resolve_string(klass)
124s E                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s E         File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s E           return import_item(string)
124s E                  ^^^^^^^^^^^^^^^^^^^
124s E         File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s E           module = __import__(package, fromlist=[obj])
124s E                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s E       ModuleNotFoundError: No module named 'jupyter_server'
124s E 
124s E       During handling of the above exception, another exception occurred:
124s E 
124s E       Traceback (most recent call last):
124s E         File "", line 198, in _run_module_as_main
124s E         File "", line 88, in _run_code
124s E         File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/__main__.py", line 3, in 
124s E           app.launch_new_instance()
124s E         File "/usr/lib/python3/dist-packages/jupyter_core/application.py", line 282, in launch_instance
124s E           super().launch_instance(argv=argv, **kwargs)
124s E         File "/usr/lib/python3/dist-packages/traitlets/config/application.py", line 1073, in launch_instance
124s E           app = cls.instance(**kwargs)
124s E                 ^^^^^^^^^^^^^^^^^^^^^^
124s E         File "/usr/lib/python3/dist-packages/traitlets/config/configurable.py", line 583, in instance
124s E           inst = cls(*args, **kwargs)
124s E                  ^^^^^^^^^^^^^^^^^^^^
124s E         File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s E           inst.setup_instance(*args, **kwargs)
124s E         File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s E           super(HasTraits, self).setup_instance(*args, **kwargs)
124s E         File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s E           init(self)
124s E         File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s E           self._resolve_classes()
124s E         File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s E           warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s E       TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s /usr/lib/python3/dist-packages/traitlets/tests/utils.py:38: AssertionError
124s ____________________________ test_server_info_file _____________________________
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s >                   klass = self._resolve_string(klass)
124s 
124s notebook/traittypes.py:235:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string
124s     return import_item(string)
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s name = 'jupyter_server.contents.services.managers.ContentsManager'
124s 
124s     def import_item(name: str) -> Any:
124s         """Import and return ``bar`` given the string ``foo.bar``.
124s 
124s         Calling ``bar = import_item("foo.bar")`` is the functional equivalent of
124s         executing the code ``from foo import bar``.
124s 
124s         Parameters
124s         ----------
124s         name : string
124s             The fully qualified name of the module/package being imported.
124s 
124s         Returns
124s         -------
124s         mod : module object
124s             The module that was imported.
124s         """
124s         if not isinstance(name, str):
124s             raise TypeError("import_item accepts strings, not '%s'." % type(name))
124s         parts = name.rsplit(".", 1)
124s         if len(parts) == 2:
124s             # called with 'foo.bar....'
124s             package, obj = parts
124s >           module = __import__(package, fromlist=[obj])
124s E           ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s     def test_server_info_file():
124s         td = TemporaryDirectory()
124s >       nbapp = NotebookApp(runtime_dir=td.name, log=logging.getLogger())
124s 
124s notebook/tests/test_notebookapp.py:32:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__
124s     inst.setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance
124s     init(self)
124s notebook/traittypes.py:226: in instance_init
124s     self._resolve_classes()
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s                     klass = self._resolve_string(klass)
124s                     self.importable_klasses.append(klass)
124s                 except:
124s >                   warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s E                   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s notebook/traittypes.py:238: TypeError
124s _________________________________ test_nb_dir __________________________________
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
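The `import_item` source shown in these tracebacks is small enough to trace by hand. A runnable sketch of the same last-dot-split logic, using a stdlib module since `jupyter_server` is exactly what is missing on this testbed (the unimportable package name below is illustrative):

```python
# Equivalent of `from os import path`, as import_item("os.path") would do:
name = "os.path"
package, obj = name.rsplit(".", 1)        # -> ("os", "path")
module = __import__(package, fromlist=[obj])
assert getattr(module, obj) is __import__("os").path

# With a package that is not installed, __import__ raises the
# ModuleNotFoundError seen throughout this log:
try:
    __import__("definitely_not_installed_pkg", fromlist=["x"])
except ModuleNotFoundError:
    pass
```

That `ModuleNotFoundError` is then caught by `_resolve_classes`, whose warning path raises the `TypeError` that actually fails each test.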
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s >                   klass = self._resolve_string(klass)
124s 
124s notebook/traittypes.py:235:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string
124s     return import_item(string)
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s name = 'jupyter_server.contents.services.managers.ContentsManager'
124s 
124s     def import_item(name: str) -> Any:
124s         """Import and return ``bar`` given the string ``foo.bar``.
124s 
124s         Calling ``bar = import_item("foo.bar")`` is the functional equivalent of
124s         executing the code ``from foo import bar``.
124s 
124s         Parameters
124s         ----------
124s         name : string
124s             The fully qualified name of the module/package being imported.
124s 
124s         Returns
124s         -------
124s         mod : module object
124s             The module that was imported.
124s         """
124s         if not isinstance(name, str):
124s             raise TypeError("import_item accepts strings, not '%s'." % type(name))
124s         parts = name.rsplit(".", 1)
124s         if len(parts) == 2:
124s             # called with 'foo.bar....'
124s             package, obj = parts
124s >           module = __import__(package, fromlist=[obj])
124s E           ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s     def test_nb_dir():
124s         with TemporaryDirectory() as td:
124s >           app = NotebookApp(notebook_dir=td)
124s 
124s notebook/tests/test_notebookapp.py:49:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__
124s     inst.setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance
124s     init(self)
124s notebook/traittypes.py:226: in instance_init
124s     self._resolve_classes()
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s                     klass = self._resolve_string(klass)
124s                     self.importable_klasses.append(klass)
124s                 except:
124s >                   warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s E                   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s notebook/traittypes.py:238: TypeError
124s ____________________________ test_no_create_nb_dir _____________________________
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s >                   klass = self._resolve_string(klass)
124s 
124s notebook/traittypes.py:235:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string
124s     return import_item(string)
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s name = 'jupyter_server.contents.services.managers.ContentsManager'
124s 
124s     def import_item(name: str) -> Any:
124s         """Import and return ``bar`` given the string ``foo.bar``.
124s 
124s         Calling ``bar = import_item("foo.bar")`` is the functional equivalent of
124s         executing the code ``from foo import bar``.
124s 
124s         Parameters
124s         ----------
124s         name : string
124s             The fully qualified name of the module/package being imported.
124s 
124s         Returns
124s         -------
124s         mod : module object
124s             The module that was imported.
124s         """
124s         if not isinstance(name, str):
124s             raise TypeError("import_item accepts strings, not '%s'." % type(name))
124s         parts = name.rsplit(".", 1)
124s         if len(parts) == 2:
124s             # called with 'foo.bar....'
124s             package, obj = parts
124s >           module = __import__(package, fromlist=[obj])
124s E           ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s     def test_no_create_nb_dir():
124s         with TemporaryDirectory() as td:
124s             nbdir = os.path.join(td, 'notebooks')
124s >           app = NotebookApp()
124s 
124s notebook/tests/test_notebookapp.py:55:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__
124s     inst.setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance
124s     init(self)
124s notebook/traittypes.py:226: in instance_init
124s     self._resolve_classes()
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s                     klass = self._resolve_string(klass)
124s                     self.importable_klasses.append(klass)
124s                 except:
124s >                   warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s E                   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s notebook/traittypes.py:238: TypeError
124s _____________________________ test_missing_nb_dir ______________________________
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s >                   klass = self._resolve_string(klass)
124s 
124s notebook/traittypes.py:235:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string
124s     return import_item(string)
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s name = 'jupyter_server.contents.services.managers.ContentsManager'
124s 
124s     def import_item(name: str) -> Any:
124s         """Import and return ``bar`` given the string ``foo.bar``.
124s 
124s         Calling ``bar = import_item("foo.bar")`` is the functional equivalent of
124s         executing the code ``from foo import bar``.
124s 
124s         Parameters
124s         ----------
124s         name : string
124s             The fully qualified name of the module/package being imported.
124s 
124s         Returns
124s         -------
124s         mod : module object
124s             The module that was imported.
124s         """
124s         if not isinstance(name, str):
124s             raise TypeError("import_item accepts strings, not '%s'." % type(name))
124s         parts = name.rsplit(".", 1)
124s         if len(parts) == 2:
124s             # called with 'foo.bar....'
124s             package, obj = parts
124s >           module = __import__(package, fromlist=[obj])
124s E           ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s     def test_missing_nb_dir():
124s         with TemporaryDirectory() as td:
124s             nbdir = os.path.join(td, 'notebook', 'dir', 'is', 'missing')
124s >           app = NotebookApp()
124s 
124s notebook/tests/test_notebookapp.py:62:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__
124s     inst.setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance
124s     init(self)
124s notebook/traittypes.py:226: in instance_init
124s     self._resolve_classes()
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s                     klass = self._resolve_string(klass)
124s                     self.importable_klasses.append(klass)
124s                 except:
124s >                   warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s E                   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s notebook/traittypes.py:238: TypeError
124s _____________________________ test_invalid_nb_dir ______________________________
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s >                   klass = self._resolve_string(klass)
124s 
124s notebook/traittypes.py:235:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string
124s     return import_item(string)
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s name = 'jupyter_server.contents.services.managers.ContentsManager'
124s 
124s     def import_item(name: str) -> Any:
124s         """Import and return ``bar`` given the string ``foo.bar``.
124s 
124s         Calling ``bar = import_item("foo.bar")`` is the functional equivalent of
124s         executing the code ``from foo import bar``.
124s 
124s         Parameters
124s         ----------
124s         name : string
124s             The fully qualified name of the module/package being imported.
124s 
124s         Returns
124s         -------
124s         mod : module object
124s             The module that was imported.
124s         """
124s         if not isinstance(name, str):
124s             raise TypeError("import_item accepts strings, not '%s'." % type(name))
124s         parts = name.rsplit(".", 1)
124s         if len(parts) == 2:
124s             # called with 'foo.bar....'
124s             package, obj = parts
124s >           module = __import__(package, fromlist=[obj])
124s E           ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s     def test_invalid_nb_dir():
124s         with NamedTemporaryFile() as tf:
124s >           app = NotebookApp()
124s 
124s notebook/tests/test_notebookapp.py:68:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__
124s     inst.setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance
124s     init(self)
124s notebook/traittypes.py:226: in instance_init
124s     self._resolve_classes()
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s                     klass = self._resolve_string(klass)
124s                     self.importable_klasses.append(klass)
124s                 except:
124s >                   warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s E                   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s notebook/traittypes.py:238: TypeError
124s ____________________________ test_nb_dir_with_slash ____________________________
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
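The control flow in `_resolve_classes` that these tracebacks repeat is worth spelling out: a bare `except:` is meant to downgrade an unimportable class name to an `ImportWarning`, so when the `warn()` call inside the handler itself raises, the failure escapes and aborts setup for every test. A runnable sketch of that intended flow, with an `ImportError` handler in place of the bare `except:` and illustrative class names:

```python
import warnings

# Intended behavior of _resolve_classes (per the tracebacks above):
# resolve dotted names, and merely warn about ones that cannot import.
klasses = ["os.path.join", "not_a_module.Klass"]  # illustrative names
importable = []
for klass in klasses:
    try:
        package, obj = klass.rsplit(".", 1)
        importable.append(getattr(__import__(package, fromlist=[obj]), obj))
    except ImportError:  # narrower than the bare except: shown in the log
        warnings.warn(f"{klass} is not importable. Is it installed?",
                      ImportWarning, stacklevel=2)
```

If the warning call cannot fail, the missing `jupyter_server` dependency would produce one warning per name instead of a `TypeError` in every test's setup.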
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s >                   klass = self._resolve_string(klass)
124s 
124s notebook/traittypes.py:235:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string
124s     return import_item(string)
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s name = 'jupyter_server.contents.services.managers.ContentsManager'
124s 
124s     def import_item(name: str) -> Any:
124s         """Import and return ``bar`` given the string ``foo.bar``.
124s 
124s         Calling ``bar = import_item("foo.bar")`` is the functional equivalent of
124s         executing the code ``from foo import bar``.
124s 
124s         Parameters
124s         ----------
124s         name : string
124s             The fully qualified name of the module/package being imported.
124s 
124s         Returns
124s         -------
124s         mod : module object
124s             The module that was imported.
124s         """
124s         if not isinstance(name, str):
124s             raise TypeError("import_item accepts strings, not '%s'." % type(name))
124s         parts = name.rsplit(".", 1)
124s         if len(parts) == 2:
124s             # called with 'foo.bar....'
124s             package, obj = parts
124s >           module = __import__(package, fromlist=[obj])
124s E           ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s     def test_nb_dir_with_slash():
124s         with TemporaryDirectory(suffix="_slash" + os.sep) as td:
124s >           app = NotebookApp(notebook_dir=td)
124s 
124s notebook/tests/test_notebookapp.py:74:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__
124s     inst.setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance
124s     init(self)
124s notebook/traittypes.py:226: in instance_init
124s     self._resolve_classes()
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s                     klass = self._resolve_string(klass)
124s                     self.importable_klasses.append(klass)
124s                 except:
124s >                   warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s E                   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s notebook/traittypes.py:238: TypeError
124s _______________________________ test_nb_dir_root _______________________________
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s >                   klass = self._resolve_string(klass)
124s 
124s notebook/traittypes.py:235:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string
124s     return import_item(string)
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s name = 'jupyter_server.contents.services.managers.ContentsManager'
124s 
124s     def import_item(name: str) -> Any:
124s         """Import and return ``bar`` given the string ``foo.bar``.
124s 
124s         Calling ``bar = import_item("foo.bar")`` is the functional equivalent of
124s         executing the code ``from foo import bar``.
124s 
124s         Parameters
124s         ----------
124s         name : string
124s             The fully qualified name of the module/package being imported.
124s 
124s         Returns
124s         -------
124s         mod : module object
124s             The module that was imported.
124s         """
124s         if not isinstance(name, str):
124s             raise TypeError("import_item accepts strings, not '%s'." % type(name))
124s         parts = name.rsplit(".", 1)
124s         if len(parts) == 2:
124s             # called with 'foo.bar....'
124s             package, obj = parts
124s >           module = __import__(package, fromlist=[obj])
124s E           ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s     def test_nb_dir_root():
124s         root = os.path.abspath(os.sep)  # gets the right value on Windows, Posix
124s >       app = NotebookApp(notebook_dir=root)
124s 
124s notebook/tests/test_notebookapp.py:79:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__
124s     inst.setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance
124s     init(self)
124s notebook/traittypes.py:226: in instance_init
124s     self._resolve_classes()
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s                     klass = self._resolve_string(klass)
124s                     self.importable_klasses.append(klass)
124s                 except:
124s >                   warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s E                   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s notebook/traittypes.py:238: TypeError
124s _____________________________ test_generate_config _____________________________
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s >                   klass = self._resolve_string(klass)
124s 
124s notebook/traittypes.py:235:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string
124s     return import_item(string)
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s name = 'jupyter_server.contents.services.managers.ContentsManager'
124s 
124s     def import_item(name: str) -> Any:
124s         """Import and return ``bar`` given the string ``foo.bar``.
124s 
124s         Calling ``bar = import_item("foo.bar")`` is the functional equivalent of
124s         executing the code ``from foo import bar``.
124s 
124s         Parameters
124s         ----------
124s         name : string
124s             The fully qualified name of the module/package being imported.
124s 
124s         Returns
124s         -------
124s         mod : module object
124s             The module that was imported.
124s         """
124s         if not isinstance(name, str):
124s             raise TypeError("import_item accepts strings, not '%s'." % type(name))
124s         parts = name.rsplit(".", 1)
124s         if len(parts) == 2:
124s             # called with 'foo.bar....'
124s             package, obj = parts
124s >           module = __import__(package, fromlist=[obj])
124s E           ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s     def test_generate_config():
124s         with TemporaryDirectory() as td:
124s >           app = NotebookApp(config_dir=td)
124s 
124s notebook/tests/test_notebookapp.py:84:
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__
124s     inst.setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance
124s     init(self)
124s notebook/traittypes.py:226: in instance_init
124s     self._resolve_classes()
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s                     klass = self._resolve_string(klass)
124s                     self.importable_klasses.append(klass)
124s                 except:
124s >                   warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s E                   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s notebook/traittypes.py:238: TypeError
124s ____________________________ test_notebook_password ____________________________
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s >                   klass = self._resolve_string(klass)
124s 
124s notebook/traittypes.py:235: 
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string
124s     return import_item(string)
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s name = 'jupyter_server.contents.services.managers.ContentsManager'
124s 
124s     def import_item(name: str) -> Any:
124s         """Import and return ``bar`` given the string ``foo.bar``.
124s 
124s         Calling ``bar = import_item("foo.bar")`` is the functional equivalent of
124s         executing the code ``from foo import bar``.
124s 
124s         Parameters
124s         ----------
124s         name : string
124s             The fully qualified name of the module/package being imported.
124s 
124s         Returns
124s         -------
124s         mod : module object
124s             The module that was imported.
124s         """
124s         if not isinstance(name, str):
124s             raise TypeError("import_item accepts strings, not '%s'." % type(name))
124s         parts = name.rsplit(".", 1)
124s         if len(parts) == 2:
124s             # called with 'foo.bar....'
124s             package, obj = parts
124s >           module = __import__(package, fromlist=[obj])
124s E           ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s     def test_notebook_password():
124s         password = 'secret'
124s         with TemporaryDirectory() as td:
124s             with patch.dict('os.environ', {
124s                 'JUPYTER_CONFIG_DIR': td,
124s             }), patch.object(getpass, 'getpass', return_value=password):
124s                 app = notebookapp.NotebookPasswordApp(log_level=logging.ERROR)
124s                 app.initialize([])
124s                 app.start()
124s >           nb = NotebookApp()
124s 
124s notebook/tests/test_notebookapp.py:133: 
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__
124s     inst.setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance
124s     init(self)
124s notebook/traittypes.py:226: in instance_init
124s     self._resolve_classes()
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s                     klass = self._resolve_string(klass)
124s                     self.importable_klasses.append(klass)
124s                 except:
124s >                   warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s E                   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s notebook/traittypes.py:238: TypeError
124s _________________ TestInstallServerExtension.test_merge_config _________________
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s >                   klass = self._resolve_string(klass)
124s 
124s notebook/traittypes.py:235: 
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string
124s     return import_item(string)
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s name = 'jupyter_server.contents.services.managers.ContentsManager'
124s 
124s     def import_item(name: str) -> Any:
124s         """Import and return ``bar`` given the string ``foo.bar``.
124s 
124s         Calling ``bar = import_item("foo.bar")`` is the functional equivalent of
124s         executing the code ``from foo import bar``.
124s 
124s         Parameters
124s         ----------
124s         name : string
124s             The fully qualified name of the module/package being imported.
124s 
124s         Returns
124s         -------
124s         mod : module object
124s             The module that was imported.
124s         """
124s         if not isinstance(name, str):
124s             raise TypeError("import_item accepts strings, not '%s'." % type(name))
124s         parts = name.rsplit(".", 1)
124s         if len(parts) == 2:
124s             # called with 'foo.bar....'
124s             package, obj = parts
124s >           module = __import__(package, fromlist=[obj])
124s E           ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s self = 
124s 
124s     def test_merge_config(self):
124s         # enabled at sys level
124s         mock_sys = self._inject_mock_extension('mockext_sys')
124s         # enabled at sys, disabled at user
124s         mock_both = self._inject_mock_extension('mockext_both')
124s         # enabled at user
124s         mock_user = self._inject_mock_extension('mockext_user')
124s         # enabled at Python
124s         mock_py = self._inject_mock_extension('mockext_py')
124s 
124s         toggle_serverextension_python('mockext_sys', enabled=True, user=False)
124s         toggle_serverextension_python('mockext_user', enabled=True, user=True)
124s         toggle_serverextension_python('mockext_both', enabled=True, user=False)
124s         toggle_serverextension_python('mockext_both', enabled=False, user=True)
124s 
124s >       app = NotebookApp(nbserver_extensions={'mockext_py': True})
124s 
124s notebook/tests/test_serverextensions.py:147: 
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__
124s     inst.setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance
124s     init(self)
124s notebook/traittypes.py:226: in instance_init
124s     self._resolve_classes()
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s                     klass = self._resolve_string(klass)
124s                     self.importable_klasses.append(klass)
124s                 except:
124s >                   warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s E                   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s notebook/traittypes.py:238: TypeError
124s _________________ TestOrderedServerExtension.test_load_ordered _________________
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s >                   klass = self._resolve_string(klass)
124s 
124s notebook/traittypes.py:235: 
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string
124s     return import_item(string)
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s name = 'jupyter_server.contents.services.managers.ContentsManager'
124s 
124s     def import_item(name: str) -> Any:
124s         """Import and return ``bar`` given the string ``foo.bar``.
124s 
124s         Calling ``bar = import_item("foo.bar")`` is the functional equivalent of
124s         executing the code ``from foo import bar``.
124s 
124s         Parameters
124s         ----------
124s         name : string
124s             The fully qualified name of the module/package being imported.
124s 
124s         Returns
124s         -------
124s         mod : module object
124s             The module that was imported.
124s         """
124s         if not isinstance(name, str):
124s             raise TypeError("import_item accepts strings, not '%s'." % type(name))
124s         parts = name.rsplit(".", 1)
124s         if len(parts) == 2:
124s             # called with 'foo.bar....'
124s             package, obj = parts
124s >           module = __import__(package, fromlist=[obj])
124s E           ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s self = 
124s 
124s     def test_load_ordered(self):
124s >       app = NotebookApp()
124s 
124s notebook/tests/test_serverextensions.py:189: 
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__
124s     inst.setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance
124s     init(self)
124s notebook/traittypes.py:226: in instance_init
124s     self._resolve_classes()
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s self = 
124s 
124s     def _resolve_classes(self):
124s         # Resolve all string names to actual classes.
124s         self.importable_klasses = []
124s         for klass in self.klasses:
124s             if isinstance(klass, str):
124s                 try:
124s                     klass = self._resolve_string(klass)
124s                     self.importable_klasses.append(klass)
124s                 except:
124s >                   warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s E                   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s notebook/traittypes.py:238: TypeError
124s _______________________________ test_help_output _______________________________
124s 
124s     def test_help_output():
124s         """jupyter notebook --help-all works"""
124s         # FIXME: will be notebook
124s >       check_help_all_output('notebook')
124s 
124s notebook/tests/test_utils.py:21: 
124s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
124s 
124s pkg = 'notebook', subcommand = None
124s 
124s     def check_help_all_output(pkg: str, subcommand: Sequence[str] | None = None) -> tuple[str, str]:
124s         """test that `python -m PKG --help-all` works"""
124s         cmd = [sys.executable, "-m", pkg]
124s         if subcommand:
124s             cmd.extend(subcommand)
124s         cmd.append("--help-all")
124s         out, err, rc = get_output_error_code(cmd)
124s >       assert rc == 0, err
124s E       AssertionError: Traceback (most recent call last):
124s E         File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s E           klass = self._resolve_string(klass)
124s E                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s E         File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s E           return import_item(string)
124s E                  ^^^^^^^^^^^^^^^^^^^
124s E         File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s E           module = __import__(package, fromlist=[obj])
124s E                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s E       ModuleNotFoundError: No module named 'jupyter_server'
124s E 
124s E       During handling of the above exception, another exception occurred:
124s E 
124s E       Traceback (most recent call last):
124s E         File "", line 198, in _run_module_as_main
124s E         File "", line 88, in _run_code
124s E         File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/__main__.py", line 3, in 
124s E           app.launch_new_instance()
124s E         File "/usr/lib/python3/dist-packages/jupyter_core/application.py", line 282, in launch_instance
124s E           super().launch_instance(argv=argv, **kwargs)
124s E         File "/usr/lib/python3/dist-packages/traitlets/config/application.py", line 1073, in launch_instance
124s E           app = cls.instance(**kwargs)
124s E                 ^^^^^^^^^^^^^^^^^^^^^^
124s E         File "/usr/lib/python3/dist-packages/traitlets/config/configurable.py", line 583, in instance
124s E           inst = cls(*args, **kwargs)
124s E                  ^^^^^^^^^^^^^^^^^^^^
124s E         File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s E           inst.setup_instance(*args, **kwargs)
124s E         File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s E           super(HasTraits, self).setup_instance(*args, **kwargs)
124s E         File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s E           init(self)
124s E         File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s E           self._resolve_classes()
124s E         File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s E           warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s E       TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s /usr/lib/python3/dist-packages/traitlets/tests/utils.py:38: AssertionError
124s =============================== warnings summary ===============================
124s notebook/nbextensions.py:15
124s   /tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/nbextensions.py:15: DeprecationWarning: Jupyter is migrating its paths to use standard platformdirs
124s   given by the platformdirs library. To remove this warning and
124s   see the appropriate new directories, set the environment variable
124s   `JUPYTER_PLATFORM_DIRS=1` and then run `jupyter --paths`.
124s   The use of platformdirs will be the default in `jupyter_core` v6
124s     from jupyter_core.paths import (
124s 
124s notebook/utils.py:280
124s notebook/utils.py:280
124s   /tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/utils.py:280: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
124s     return LooseVersion(v) >= LooseVersion(check)
124s 
124s notebook/_tz.py:29: 1 warning
124s notebook/services/sessions/tests/test_sessionmanager.py: 9 warnings
124s   /tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/_tz.py:29: DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.datetime.now(datetime.UTC).
124s     dt = unaware(*args, **kwargs)
124s 
124s notebook/tests/test_notebookapp_integration.py:14
124s   /tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/test_notebookapp_integration.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.integration_tests - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
124s     pytestmark = pytest.mark.integration_tests
124s 
124s notebook/auth/tests/test_login.py::LoginTest::test_next_bad
124s   /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-1 (start_thread)
124s 
124s   Traceback (most recent call last):
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s       klass = self._resolve_string(klass)
124s               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s       return import_item(string)
124s              ^^^^^^^^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s       module = __import__(package, fromlist=[obj])
124s                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s   During handling of the above exception, another exception occurred:
124s 
124s   Traceback (most recent call last):
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s       app = cls.notebook = NotebookApp(
124s             ^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s       inst.setup_instance(*args, **kwargs)
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s       super(HasTraits, self).setup_instance(*args, **kwargs)
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s       init(self)
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s       self._resolve_classes()
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s       warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s   During handling of the above exception, another exception occurred:
124s 
124s   Traceback (most recent call last):
124s     File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s       self.run()
124s     File "/usr/lib/python3.12/threading.py", line 1010, in run
124s       self._target(*self._args, **self._kwargs)
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s       app.session_manager.close()
124s       ^^^
124s   UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s     warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/bundler/tests/test_bundler_api.py::BundleAPITest::test_bundler_import_error
124s   /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-2 (start_thread)
124s 
124s   Traceback (most recent call last):
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s       klass = self._resolve_string(klass)
124s               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s       return import_item(string)
124s              ^^^^^^^^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s       module = __import__(package, fromlist=[obj])
124s                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s   During handling of the above exception, another exception occurred:
124s 
124s   Traceback (most recent call last):
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s       app = cls.notebook = NotebookApp(
124s             ^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s       inst.setup_instance(*args, **kwargs)
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s       super(HasTraits, self).setup_instance(*args, **kwargs)
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s       init(self)
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s       self._resolve_classes()
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s       warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s   During handling of the above exception, another exception occurred:
124s 
124s   Traceback (most recent call last):
124s     File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s       self.run()
124s     File "/usr/lib/python3.12/threading.py", line 1010, in run
124s       self._target(*self._args, **self._kwargs)
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s       app.session_manager.close()
124s       ^^^
124s   UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s     warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/services/api/tests/test_api.py::APITest::test_get_spec
124s   /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-3 (start_thread)
124s 
124s   Traceback (most recent call last):
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s       klass = self._resolve_string(klass)
124s               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s       return import_item(string)
124s              ^^^^^^^^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s       module = __import__(package, fromlist=[obj])
124s                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s   During handling of the above exception, another exception occurred:
124s 
124s   Traceback (most recent call last):
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s       app = cls.notebook = NotebookApp(
124s             ^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s       inst.setup_instance(*args, **kwargs)
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s       super(HasTraits, self).setup_instance(*args, **kwargs)
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s       init(self)
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s       self._resolve_classes()
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s       warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s   During handling of the above exception, another exception occurred:
124s 
124s   Traceback (most recent call last):
124s     File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s       self.run()
124s     File "/usr/lib/python3.12/threading.py", line 1010, in run
124s       self._target(*self._args, **self._kwargs)
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s       app.session_manager.close()
124s       ^^^
124s   UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s     warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/services/config/tests/test_config_api.py::APITest::test_create_retrieve_config
124s   /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-4 (start_thread)
124s 
124s   Traceback (most recent call last):
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s       klass = self._resolve_string(klass)
124s               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s       return import_item(string)
124s              ^^^^^^^^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s       module = __import__(package, fromlist=[obj])
124s                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s   During handling of the above exception, another exception occurred:
124s 
124s   Traceback (most recent call last):
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s       app = cls.notebook = NotebookApp(
124s             ^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s       inst.setup_instance(*args, **kwargs)
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s       super(HasTraits, self).setup_instance(*args, **kwargs)
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s       init(self)
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s       self._resolve_classes()
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s       warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s   During handling of the above exception, another exception occurred:
124s 
124s   Traceback (most recent call last):
124s     File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s       self.run()
124s     File "/usr/lib/python3.12/threading.py", line 1010, in run
124s       self._target(*self._args, **self._kwargs)
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s       app.session_manager.close()
124s       ^^^
124s   UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s     warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/services/contents/tests/test_contents_api.py::APITest::test_checkpoints
124s   /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-5 (start_thread)
124s 
124s   Traceback (most recent call last):
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s       klass = self._resolve_string(klass)
124s               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s       return import_item(string)
124s              ^^^^^^^^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s       module = __import__(package, fromlist=[obj])
124s                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s   During handling of the above exception, another exception occurred:
124s 
124s   Traceback (most recent call last):
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s       app = cls.notebook = NotebookApp(
124s             ^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s       inst.setup_instance(*args, **kwargs)
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s       super(HasTraits, self).setup_instance(*args, **kwargs)
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s       init(self)
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s       self._resolve_classes()
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s       warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s   During handling of the above exception, another exception occurred:
124s 
124s   Traceback (most recent call last):
124s     File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s       self.run()
124s     File "/usr/lib/python3.12/threading.py", line 1010, in run
124s       self._target(*self._args, **self._kwargs)
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s       app.session_manager.close()
124s       ^^^
124s   UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s     warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_checkpoints
124s   /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-6 (start_thread)
124s 
124s   Traceback (most recent call last):
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s       klass = self._resolve_string(klass)
124s               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s       return import_item(string)
124s              ^^^^^^^^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s       module = __import__(package, fromlist=[obj])
124s                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s   During handling of the above exception, another exception occurred:
124s 
124s   Traceback (most recent call last):
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s       app = cls.notebook = NotebookApp(
124s             ^^^^^^^^^^^^
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s       inst.setup_instance(*args, **kwargs)
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s       super(HasTraits, self).setup_instance(*args, **kwargs)
124s     File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s       init(self)
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s       self._resolve_classes()
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s       warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s   TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s   During handling of the above exception, another exception occurred:
124s 
124s   Traceback (most recent call last):
124s     File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s       self.run()
124s     File "/usr/lib/python3.12/threading.py", line 1010, in run
124s       self._target(*self._args, **self._kwargs)
124s     File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s       app.session_manager.close()
124s       ^^^
124s   UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s     warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/services/contents/tests/test_largefilemanager.py: 42 warnings
124s notebook/services/contents/tests/test_manager.py: 526 warnings
124s   /tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/_tz.py:29: DeprecationWarning: datetime.datetime.utcfromtimestamp() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.datetime.fromtimestamp(timestamp, datetime.UTC).
124s   dt = unaware(*args, **kwargs)
124s 
124s notebook/services/kernels/tests/test_kernels_api.py::KernelAPITest::test_connections
124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-7 (start_thread)
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s     klass = self._resolve_string(klass)
124s             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s     return import_item(string)
124s            ^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s     module = __import__(package, fromlist=[obj])
124s              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s     app = cls.notebook = NotebookApp(
124s           ^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s     inst.setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s     init(self)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s     self._resolve_classes()
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s     warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s     self.run()
124s   File "/usr/lib/python3.12/threading.py", line 1010, in run
124s     self._target(*self._args, **self._kwargs)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s     app.session_manager.close()
124s     ^^^
124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s   warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/services/kernels/tests/test_kernels_api.py::AsyncKernelAPITest::test_connections
124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-8 (start_thread)
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s     klass = self._resolve_string(klass)
124s             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s     return import_item(string)
124s            ^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s     module = __import__(package, fromlist=[obj])
124s              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s     app = cls.notebook = NotebookApp(
124s           ^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s     inst.setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s     init(self)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s     self._resolve_classes()
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s     warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s     self.run()
124s   File "/usr/lib/python3.12/threading.py", line 1010, in run
124s     self._target(*self._args, **self._kwargs)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s     app.session_manager.close()
124s     ^^^
124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s   warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/services/kernels/tests/test_kernels_api.py::KernelFilterTest::test_config
124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-9 (start_thread)
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s     klass = self._resolve_string(klass)
124s             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s     return import_item(string)
124s            ^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s     module = __import__(package, fromlist=[obj])
124s              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s     app = cls.notebook = NotebookApp(
124s           ^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s     inst.setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s     init(self)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s     self._resolve_classes()
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s     warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s     self.run()
124s   File "/usr/lib/python3.12/threading.py", line 1010, in run
124s     self._target(*self._args, **self._kwargs)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s     app.session_manager.close()
124s     ^^^
124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s   warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/services/kernels/tests/test_kernels_api.py::KernelCullingTest::test_culling
124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-10 (start_thread)
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s     klass = self._resolve_string(klass)
124s             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s     return import_item(string)
124s            ^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s     module = __import__(package, fromlist=[obj])
124s              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s     app = cls.notebook = NotebookApp(
124s           ^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s     inst.setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s     init(self)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s     self._resolve_classes()
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s     warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s     self.run()
124s   File "/usr/lib/python3.12/threading.py", line 1010, in run
124s     self._target(*self._args, **self._kwargs)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s     app.session_manager.close()
124s     ^^^
124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s   warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/services/kernelspecs/tests/test_kernelspecs_api.py::APITest::test_get_kernel_resource_file
124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-11 (start_thread)
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s     klass = self._resolve_string(klass)
124s             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s     return import_item(string)
124s            ^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s     module = __import__(package, fromlist=[obj])
124s              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s     app = cls.notebook = NotebookApp(
124s           ^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s     inst.setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s     init(self)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s     self._resolve_classes()
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s     warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s     self.run()
124s   File "/usr/lib/python3.12/threading.py", line 1010, in run
124s     self._target(*self._args, **self._kwargs)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s     app.session_manager.close()
124s     ^^^
124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s   warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/services/nbconvert/tests/test_nbconvert_api.py::APITest::test_list_formats
124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-12 (start_thread)
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s     klass = self._resolve_string(klass)
124s             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s     return import_item(string)
124s            ^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s     module = __import__(package, fromlist=[obj])
124s              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s     app = cls.notebook = NotebookApp(
124s           ^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s     inst.setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s     init(self)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s     self._resolve_classes()
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s     warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s     self.run()
124s   File "/usr/lib/python3.12/threading.py", line 1010, in run
124s     self._target(*self._args, **self._kwargs)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s     app.session_manager.close()
124s     ^^^
124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s   warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_create
124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-13 (start_thread)
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s     klass = self._resolve_string(klass)
124s             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s     return import_item(string)
124s            ^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s     module = __import__(package, fromlist=[obj])
124s              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s     app = cls.notebook = NotebookApp(
124s           ^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s     inst.setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s     init(self)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s     self._resolve_classes()
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s     warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s     self.run()
124s   File "/usr/lib/python3.12/threading.py", line 1010, in run
124s     self._target(*self._args, **self._kwargs)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s     app.session_manager.close()
124s     ^^^
124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s   warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_create
124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-14 (start_thread)
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s     klass = self._resolve_string(klass)
124s             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s     return import_item(string)
124s            ^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s     module = __import__(package, fromlist=[obj])
124s              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s     app = cls.notebook = NotebookApp(
124s           ^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s     inst.setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s     init(self)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s     self._resolve_classes()
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s     warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s     self.run()
124s   File "/usr/lib/python3.12/threading.py", line 1010, in run
124s     self._target(*self._args, **self._kwargs)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s     app.session_manager.close()
124s     ^^^
124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s   warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/terminal/tests/test_terminals_api.py::TerminalAPITest::test_create_terminal
124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-15 (start_thread)
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s     klass = self._resolve_string(klass)
124s             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s     return import_item(string)
124s            ^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s     module = __import__(package, fromlist=[obj])
124s              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s     app = cls.notebook = NotebookApp(
124s           ^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s     inst.setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s     init(self)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s     self._resolve_classes()
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s     warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s     self.run()
124s   File "/usr/lib/python3.12/threading.py", line 1010, in run
124s     self._target(*self._args, **self._kwargs)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s     app.session_manager.close()
124s     ^^^
124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s   warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/terminal/tests/test_terminals_api.py::TerminalCullingTest::test_config
124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-16 (start_thread)
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s     klass = self._resolve_string(klass)
124s             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s     return import_item(string)
124s            ^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s     module = __import__(package, fromlist=[obj])
124s              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s     app = cls.notebook = NotebookApp(
124s           ^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s     inst.setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s     init(self)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s     self._resolve_classes()
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s     warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s     self.run()
124s   File "/usr/lib/python3.12/threading.py", line 1010, in run
124s     self._target(*self._args, **self._kwargs)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s     app.session_manager.close()
124s     ^^^
124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s   warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/tests/test_files.py::FilesTest::test_contents_manager
124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-17 (start_thread)
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s     klass = self._resolve_string(klass)
124s             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s     return import_item(string)
124s            ^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s     module = __import__(package, fromlist=[obj])
124s              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s     app = cls.notebook = NotebookApp(
124s           ^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s     inst.setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s     init(self)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s     self._resolve_classes()
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s     warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s     self.run()
124s   File "/usr/lib/python3.12/threading.py", line 1010, in run
124s     self._target(*self._args, **self._kwargs)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s     app.session_manager.close()
124s     ^^^
124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s   warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/tests/test_gateway.py::TestGateway::test_gateway_class_mappings
124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-18 (start_thread)
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s     klass = self._resolve_string(klass)
124s             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s     return import_item(string)
124s            ^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s     module = __import__(package, fromlist=[obj])
124s              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s     app = cls.notebook = NotebookApp(
124s           ^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s     inst.setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s     init(self)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s     self._resolve_classes()
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s     warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s     self.run()
124s   File "/usr/lib/python3.12/threading.py", line 1010, in run
124s     self._target(*self._args, **self._kwargs)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s     app.session_manager.close()
124s     ^^^
124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s   warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/tests/test_nbextensions.py::TestInstallNBExtension::test_install_tar
124s notebook/tests/test_nbextensions.py::TestInstallNBExtension::test_install_tar
124s notebook/tests/test_nbextensions.py::TestInstallNBExtension::test_install_tar
124s /tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/nbextensions.py:154: DeprecationWarning: Python 3.14 will, by default, filter extracted tar archives and reject files or modify their metadata. Use the filter argument to control this behavior.
124s   archive.extractall(nbext)
124s 
124s notebook/tests/test_notebookapp.py::NotebookAppTests::test_list_running_servers
124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-19 (start_thread)
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s     klass = self._resolve_string(klass)
124s             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s     return import_item(string)
124s            ^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s     module = __import__(package, fromlist=[obj])
124s              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s     app = cls.notebook = NotebookApp(
124s           ^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s     inst.setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s     init(self)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s     self._resolve_classes()
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s     warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s     self.run()
124s   File "/usr/lib/python3.12/threading.py", line 1010, in run
124s     self._target(*self._args, **self._kwargs)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s     app.session_manager.close()
124s     ^^^
124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s   warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/tests/test_notebookapp.py::NotebookUnixSocketTests::test_list_running_sock_servers
124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-20 (start_thread)
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s     klass = self._resolve_string(klass)
124s             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s     return import_item(string)
124s            ^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item
124s     module = __import__(package, fromlist=[obj])
124s              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s ModuleNotFoundError: No module named 'jupyter_server'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread
124s     app = cls.notebook = NotebookApp(
124s           ^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__
124s     inst.setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance
124s     super(HasTraits, self).setup_instance(*args, **kwargs)
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance
124s     init(self)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init
124s     self._resolve_classes()
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes
124s     warn(f"{klass} is not importable. Is it installed?", ImportWarning)
124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'
124s 
124s During handling of the above exception, another exception occurred:
124s 
124s Traceback (most recent call last):
124s   File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner
124s     self.run()
124s   File "/usr/lib/python3.12/threading.py", line 1010, in run
124s     self._target(*self._args, **self._kwargs)
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread
124s     app.session_manager.close()
124s     ^^^
124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value
124s 
124s   warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
124s 
124s notebook/tests/test_notebookapp.py::NotebookAppJSONLoggingTests::test_log_json_enabled
124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-21 (start_thread)
124s 
124s Traceback (most recent call last):
124s   File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes
124s     klass = self._resolve_string(klass)
124s             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
124s   File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string
124s 
return import_item(string) 124s ^^^^^^^^^^^^^^^^^^^ 124s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 124s module = __import__(package, fromlist=[obj]) 124s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 124s ModuleNotFoundError: No module named 'jupyter_server' 124s 124s During handling of the above exception, another exception occurred: 124s 124s Traceback (most recent call last): 124s File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread 124s app = cls.notebook = NotebookApp( 124s ^^^^^^^^^^^^ 124s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 124s inst.setup_instance(*args, **kwargs) 124s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 124s super(HasTraits, self).setup_instance(*args, **kwargs) 124s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 124s init(self) 124s File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init 124s self._resolve_classes() 124s File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes 124s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 124s 124s During handling of the above exception, another exception occurred: 124s 124s Traceback (most recent call last): 124s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 124s self.run() 124s File "/usr/lib/python3.12/threading.py", line 1010, in run 124s self._target(*self._args, **self._kwargs) 124s File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread 124s app.session_manager.close() 124s ^^^ 124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 124s 124s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 124s 124s notebook/tests/test_paths.py::RedirectTestCase::test_trailing_slash 124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-22 (start_thread) 124s 124s Traceback (most recent call last): 124s File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes 124s klass = self._resolve_string(klass) 124s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 124s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 124s return import_item(string) 124s ^^^^^^^^^^^^^^^^^^^ 124s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 124s module = __import__(package, fromlist=[obj]) 124s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 124s ModuleNotFoundError: No module named 'jupyter_server' 124s 124s During handling of the above exception, another exception occurred: 124s 124s Traceback (most recent call last): 124s File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread 124s app = cls.notebook = NotebookApp( 124s ^^^^^^^^^^^^ 124s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 
124s inst.setup_instance(*args, **kwargs) 124s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 124s super(HasTraits, self).setup_instance(*args, **kwargs) 124s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 124s init(self) 124s File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init 124s self._resolve_classes() 124s File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes 124s warn(f"{klass} is not importable. Is it installed?", ImportWarning) 124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 124s 124s During handling of the above exception, another exception occurred: 124s 124s Traceback (most recent call last): 124s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 124s self.run() 124s File "/usr/lib/python3.12/threading.py", line 1010, in run 124s self._target(*self._args, **self._kwargs) 124s File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread 124s app.session_manager.close() 124s ^^^ 124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 124s 124s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 124s 124s notebook/tree/tests/test_tree_handler.py::TreeTest::test_redirect 124s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-23 (start_thread) 124s 124s Traceback (most recent call last): 124s File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 235, in _resolve_classes 124s klass = self._resolve_string(klass) 124s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 124s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 124s return import_item(string) 124s ^^^^^^^^^^^^^^^^^^^ 124s File 
"/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 124s module = __import__(package, fromlist=[obj]) 124s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 124s ModuleNotFoundError: No module named 'jupyter_server' 124s 124s During handling of the above exception, another exception occurred: 124s 124s Traceback (most recent call last): 124s File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 155, in start_thread 124s app = cls.notebook = NotebookApp( 124s ^^^^^^^^^^^^ 124s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 124s inst.setup_instance(*args, **kwargs) 124s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 124s super(HasTraits, self).setup_instance(*args, **kwargs) 124s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 124s init(self) 124s File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 226, in instance_init 124s self._resolve_classes() 124s File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/traittypes.py", line 238, in _resolve_classes 124s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 124s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 124s 124s During handling of the above exception, another exception occurred: 124s 124s Traceback (most recent call last): 124s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 124s self.run() 124s File "/usr/lib/python3.12/threading.py", line 1010, in run 124s self._target(*self._args, **self._kwargs) 124s File "/tmp/autopkgtest.iPdHX2/build.Zbq/src/notebook/tests/launchnotebook.py", line 193, in start_thread 124s app.session_manager.close() 124s ^^^ 124s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 124s 124s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 124s 124s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 124s =========================== short test summary info ============================ 124s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_bad_delete_session 124s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_bad_get_session 124s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_bad_update_session 124s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_delete_session 124s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_get_session 124s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_get_session_dead_kernel 124s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_list_sessions 124s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_list_sessions_dead_kernel 124s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_update_session 124s FAILED 
notebook/tests/test_notebookapp.py::test_help_output - AssertionError:... 124s FAILED notebook/tests/test_notebookapp.py::test_server_info_file - TypeError:... 124s FAILED notebook/tests/test_notebookapp.py::test_nb_dir - TypeError: warn() mi... 124s FAILED notebook/tests/test_notebookapp.py::test_no_create_nb_dir - TypeError:... 124s FAILED notebook/tests/test_notebookapp.py::test_missing_nb_dir - TypeError: w... 124s FAILED notebook/tests/test_notebookapp.py::test_invalid_nb_dir - TypeError: w... 124s FAILED notebook/tests/test_notebookapp.py::test_nb_dir_with_slash - TypeError... 124s FAILED notebook/tests/test_notebookapp.py::test_nb_dir_root - TypeError: warn... 124s FAILED notebook/tests/test_notebookapp.py::test_generate_config - TypeError: ... 124s FAILED notebook/tests/test_notebookapp.py::test_notebook_password - TypeError... 124s FAILED notebook/tests/test_serverextensions.py::TestInstallServerExtension::test_merge_config 124s FAILED notebook/tests/test_serverextensions.py::TestOrderedServerExtension::test_load_ordered 124s FAILED notebook/tests/test_utils.py::test_help_output - AssertionError: Trace... 124s ERROR notebook/auth/tests/test_login.py::LoginTest::test_next_bad - RuntimeEr... 124s ERROR notebook/auth/tests/test_login.py::LoginTest::test_next_ok - RuntimeErr... 124s ERROR notebook/bundler/tests/test_bundler_api.py::BundleAPITest::test_bundler_import_error 124s ERROR notebook/bundler/tests/test_bundler_api.py::BundleAPITest::test_bundler_invoke 124s ERROR notebook/bundler/tests/test_bundler_api.py::BundleAPITest::test_bundler_not_enabled 124s ERROR notebook/bundler/tests/test_bundler_api.py::BundleAPITest::test_missing_bundler_arg 124s ERROR notebook/bundler/tests/test_bundler_api.py::BundleAPITest::test_notebook_not_found 124s ERROR notebook/services/api/tests/test_api.py::APITest::test_get_spec - Runti... 124s ERROR notebook/services/api/tests/test_api.py::APITest::test_get_status - Run... 
124s ERROR notebook/services/api/tests/test_api.py::APITest::test_no_track_activity 124s ERROR notebook/services/config/tests/test_config_api.py::APITest::test_create_retrieve_config 124s ERROR notebook/services/config/tests/test_config_api.py::APITest::test_get_unknown 124s ERROR notebook/services/config/tests/test_config_api.py::APITest::test_modify 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_checkpoints 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_checkpoints_separate_root 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_copy 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_copy_400_hidden 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_copy_copy 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_copy_dir_400 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_copy_path 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_copy_put_400 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_copy_put_400_hidden 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_create_untitled 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_create_untitled_txt 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_delete_hidden_dir 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_delete_hidden_file 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_file_checkpoints 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_404_hidden 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_bad_type 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_binary_file_contents 124s ERROR 
notebook/services/contents/tests/test_contents_api.py::APITest::test_get_contents_no_such_file 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_dir_no_content 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_nb_contents 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_nb_invalid 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_nb_no_content 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_text_file_contents 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_list_dirs 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_list_nonexistant_dir 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_list_notebooks 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_mkdir 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_mkdir_hidden_400 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_mkdir_untitled 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_rename 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_rename_400_hidden 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_rename_existing 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_save 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_upload 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_upload_b64 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_upload_txt 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_upload_txt_hidden 124s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_upload_v2 124s ERROR 
notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_checkpoints 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_checkpoints_separate_root 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_config_did_something 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_copy 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_copy_400_hidden 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_copy_copy 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_copy_dir_400 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_copy_path 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_copy_put_400 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_copy_put_400_hidden 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_create_untitled 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_create_untitled_txt 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_delete_hidden_dir 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_delete_hidden_file 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_file_checkpoints 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_404_hidden 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_bad_type 124s ERROR 
notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_binary_file_contents 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_contents_no_such_file 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_dir_no_content 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_nb_contents 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_nb_invalid 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_nb_no_content 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_text_file_contents 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_list_dirs 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_list_nonexistant_dir 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_list_notebooks 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_mkdir 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_mkdir_hidden_400 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_mkdir_untitled 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_rename 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_rename_400_hidden 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_rename_existing 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_save 124s 
ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_upload 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_upload_b64 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_upload_txt 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_upload_txt_hidden 124s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_upload_v2 124s ERROR notebook/services/kernels/tests/test_kernels_api.py::KernelAPITest::test_connections 124s ERROR notebook/services/kernels/tests/test_kernels_api.py::KernelAPITest::test_default_kernel 124s ERROR notebook/services/kernels/tests/test_kernels_api.py::KernelAPITest::test_kernel_handler 124s ERROR notebook/services/kernels/tests/test_kernels_api.py::KernelAPITest::test_main_kernel_handler 124s ERROR notebook/services/kernels/tests/test_kernels_api.py::KernelAPITest::test_no_kernels 124s ERROR notebook/services/kernels/tests/test_kernels_api.py::AsyncKernelAPITest::test_connections 124s ERROR notebook/services/kernels/tests/test_kernels_api.py::AsyncKernelAPITest::test_default_kernel 124s ERROR notebook/services/kernels/tests/test_kernels_api.py::AsyncKernelAPITest::test_kernel_handler 124s ERROR notebook/services/kernels/tests/test_kernels_api.py::AsyncKernelAPITest::test_main_kernel_handler 124s ERROR notebook/services/kernels/tests/test_kernels_api.py::AsyncKernelAPITest::test_no_kernels 124s ERROR notebook/services/kernels/tests/test_kernels_api.py::KernelFilterTest::test_config 124s ERROR notebook/services/kernels/tests/test_kernels_api.py::KernelCullingTest::test_culling 124s ERROR notebook/services/kernelspecs/tests/test_kernelspecs_api.py::APITest::test_get_kernel_resource_file 124s ERROR notebook/services/kernelspecs/tests/test_kernelspecs_api.py::APITest::test_get_kernelspec 124s ERROR 
notebook/services/kernelspecs/tests/test_kernelspecs_api.py::APITest::test_get_kernelspec_spaces 124s ERROR notebook/services/kernelspecs/tests/test_kernelspecs_api.py::APITest::test_get_nonexistant_kernelspec 124s ERROR notebook/services/kernelspecs/tests/test_kernelspecs_api.py::APITest::test_get_nonexistant_resource 124s ERROR notebook/services/kernelspecs/tests/test_kernelspecs_api.py::APITest::test_list_kernelspecs 124s ERROR notebook/services/kernelspecs/tests/test_kernelspecs_api.py::APITest::test_list_kernelspecs_bad 124s ERROR notebook/services/nbconvert/tests/test_nbconvert_api.py::APITest::test_list_formats 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_create 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_create_console_session 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_create_deprecated 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_create_file_session 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_create_with_kernel_id 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_delete 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_modify_kernel_id 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_modify_kernel_name 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_modify_path 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_modify_path_deprecated 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_modify_type 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_create 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_create_console_session 124s ERROR 
notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_create_deprecated 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_create_file_session 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_create_with_kernel_id 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_delete 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_modify_kernel_id 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_modify_kernel_name 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_modify_path 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_modify_path_deprecated 124s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_modify_type 124s ERROR notebook/terminal/tests/test_terminals_api.py::TerminalAPITest::test_create_terminal 124s ERROR notebook/terminal/tests/test_terminals_api.py::TerminalAPITest::test_create_terminal_via_get 124s ERROR notebook/terminal/tests/test_terminals_api.py::TerminalAPITest::test_create_terminal_with_name 124s ERROR notebook/terminal/tests/test_terminals_api.py::TerminalAPITest::test_no_terminals 124s ERROR notebook/terminal/tests/test_terminals_api.py::TerminalAPITest::test_terminal_handler 124s ERROR notebook/terminal/tests/test_terminals_api.py::TerminalAPITest::test_terminal_root_handler 124s ERROR notebook/terminal/tests/test_terminals_api.py::TerminalCullingTest::test_config 124s ERROR notebook/terminal/tests/test_terminals_api.py::TerminalCullingTest::test_culling 124s ERROR notebook/tests/test_files.py::FilesTest::test_contents_manager - Runtim... 124s ERROR notebook/tests/test_files.py::FilesTest::test_download - RuntimeError: ... 
124s ERROR notebook/tests/test_files.py::FilesTest::test_hidden_files - RuntimeErr... 124s ERROR notebook/tests/test_files.py::FilesTest::test_old_files_redirect - Runt... 124s ERROR notebook/tests/test_files.py::FilesTest::test_view_html - RuntimeError:... 124s ERROR notebook/tests/test_gateway.py::TestGateway::test_gateway_class_mappings 124s ERROR notebook/tests/test_gateway.py::TestGateway::test_gateway_get_kernelspecs 124s ERROR notebook/tests/test_gateway.py::TestGateway::test_gateway_get_named_kernelspec 124s ERROR notebook/tests/test_gateway.py::TestGateway::test_gateway_kernel_lifecycle 124s ERROR notebook/tests/test_gateway.py::TestGateway::test_gateway_options - Run... 124s ERROR notebook/tests/test_gateway.py::TestGateway::test_gateway_session_lifecycle 124s ERROR notebook/tests/test_notebookapp.py::NotebookAppTests::test_list_running_servers 124s ERROR notebook/tests/test_notebookapp.py::NotebookAppTests::test_log_json_default 124s ERROR notebook/tests/test_notebookapp.py::NotebookAppTests::test_validate_log_json 124s ERROR notebook/tests/test_notebookapp.py::NotebookUnixSocketTests::test_list_running_sock_servers 124s ERROR notebook/tests/test_notebookapp.py::NotebookUnixSocketTests::test_run 124s ERROR notebook/tests/test_notebookapp.py::NotebookAppJSONLoggingTests::test_log_json_enabled 124s ERROR notebook/tests/test_notebookapp.py::NotebookAppJSONLoggingTests::test_validate_log_json 124s ERROR notebook/tests/test_paths.py::RedirectTestCase::test_trailing_slash - R... 124s ERROR notebook/tree/tests/test_tree_handler.py::TreeTest::test_redirect - Run... 
124s = 22 failed, 123 passed, 20 skipped, 5 deselected, 608 warnings, 160 errors in 26.55s =
124s autopkgtest [11:53:32]: test pytest: -----------------------]
124s pytest FAIL non-zero exit status 1
124s autopkgtest [11:53:32]: test pytest: - - - - - - - - - - results - - - - - - - - - -
124s autopkgtest [11:53:32]: test command1: preparing testbed
174s autopkgtest [11:54:22]: testbed dpkg architecture: amd64
175s autopkgtest [11:54:23]: testbed apt version: 2.9.3
175s autopkgtest [11:54:23]: @@@@@@@@@@@@@@@@@@@@ test bed setup
175s Get:1 http://ftpmaster.internal/ubuntu oracular-proposed InRelease [110 kB]
175s Get:2 http://ftpmaster.internal/ubuntu oracular-proposed/restricted Sources [7052 B]
175s Get:3 http://ftpmaster.internal/ubuntu oracular-proposed/main Sources [36.1 kB]
175s Get:4 http://ftpmaster.internal/ubuntu oracular-proposed/universe Sources [389 kB]
175s Get:5 http://ftpmaster.internal/ubuntu oracular-proposed/multiverse Sources [2576 B]
175s Get:6 http://ftpmaster.internal/ubuntu oracular-proposed/main i386 Packages [38.2 kB]
175s Get:7 http://ftpmaster.internal/ubuntu oracular-proposed/main amd64 Packages [53.8 kB]
175s Get:8 http://ftpmaster.internal/ubuntu oracular-proposed/restricted amd64 Packages [28.9 kB]
175s Get:9 http://ftpmaster.internal/ubuntu oracular-proposed/restricted i386 Packages [6732 B]
175s Get:10 http://ftpmaster.internal/ubuntu oracular-proposed/universe amd64 Packages [316 kB]
175s Get:11 http://ftpmaster.internal/ubuntu oracular-proposed/universe i386 Packages [137 kB]
175s Get:12 http://ftpmaster.internal/ubuntu oracular-proposed/multiverse i386 Packages [3884 B]
175s Get:13 http://ftpmaster.internal/ubuntu oracular-proposed/multiverse amd64 Packages [8364 B]
175s Fetched 1138 kB in 0s (5019 kB/s)
175s Reading package lists...
176s Reading package lists...
177s Building dependency tree...
177s Reading state information...
177s Calculating upgrade...
177s The following packages will be upgraded:
177s   apt apt-utils libapt-pkg6.0t64 libldap-common libldap2
177s 5 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
177s Need to get 2875 kB of archives.
177s After this operation, 11.3 kB of additional disk space will be used.
177s Get:1 http://ftpmaster.internal/ubuntu oracular/main amd64 libapt-pkg6.0t64 amd64 2.9.5 [1023 kB]
177s Get:2 http://ftpmaster.internal/ubuntu oracular/main amd64 apt amd64 2.9.5 [1403 kB]
177s Get:3 http://ftpmaster.internal/ubuntu oracular/main amd64 apt-utils amd64 2.9.5 [223 kB]
177s Get:4 http://ftpmaster.internal/ubuntu oracular/main amd64 libldap-common all 2.6.7+dfsg-1~exp1ubuntu9 [31.5 kB]
177s Get:5 http://ftpmaster.internal/ubuntu oracular/main amd64 libldap2 amd64 2.6.7+dfsg-1~exp1ubuntu9 [195 kB]
177s Fetched 2875 kB in 0s (59.0 MB/s)
178s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 74430 files and directories currently installed.)
178s Preparing to unpack .../libapt-pkg6.0t64_2.9.5_amd64.deb ...
178s Unpacking libapt-pkg6.0t64:amd64 (2.9.5) over (2.9.3) ...
178s Setting up libapt-pkg6.0t64:amd64 (2.9.5) ...
178s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 74430 files and directories currently installed.)
178s Preparing to unpack .../archives/apt_2.9.5_amd64.deb ...
178s Unpacking apt (2.9.5) over (2.9.3) ...
178s Setting up apt (2.9.5) ...
179s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 74430 files and directories currently installed.)
179s Preparing to unpack .../apt-utils_2.9.5_amd64.deb ...
179s Unpacking apt-utils (2.9.5) over (2.9.3) ...
179s Preparing to unpack .../libldap-common_2.6.7+dfsg-1~exp1ubuntu9_all.deb ...
179s Unpacking libldap-common (2.6.7+dfsg-1~exp1ubuntu9) over (2.6.7+dfsg-1~exp1ubuntu8) ...
179s Preparing to unpack .../libldap2_2.6.7+dfsg-1~exp1ubuntu9_amd64.deb ...
179s Unpacking libldap2:amd64 (2.6.7+dfsg-1~exp1ubuntu9) over (2.6.7+dfsg-1~exp1ubuntu8) ...
179s Setting up apt-utils (2.9.5) ...
179s Setting up libldap-common (2.6.7+dfsg-1~exp1ubuntu9) ...
179s Setting up libldap2:amd64 (2.6.7+dfsg-1~exp1ubuntu9) ...
179s Processing triggers for man-db (2.12.1-2) ...
180s Processing triggers for libc-bin (2.39-0ubuntu9) ...
182s Reading package lists...
182s Building dependency tree...
182s Reading state information...
182s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
182s Hit:1 http://ftpmaster.internal/ubuntu oracular-proposed InRelease
182s Hit:2 http://ftpmaster.internal/ubuntu oracular InRelease
182s Hit:3 http://ftpmaster.internal/ubuntu oracular-updates InRelease
182s Hit:4 http://ftpmaster.internal/ubuntu oracular-security InRelease
183s Reading package lists...
183s Reading package lists...
183s Building dependency tree...
183s Reading state information...
184s Calculating upgrade...
184s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
184s Reading package lists...
184s Building dependency tree...
184s Reading state information...
184s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
185s autopkgtest [11:54:33]: rebooting testbed after setup commands that affected boot
188s autopkgtest-virt-ssh: WARNING: ssh connection failed. Retrying in 3 seconds...
198s autopkgtest-virt-ssh: WARNING: ssh connection failed. Retrying in 3 seconds...
203s Reading package lists...
204s Building dependency tree...
204s Reading state information...
204s Starting pkgProblemResolver with broken count: 0
204s Starting 2 pkgProblemResolver with broken count: 0
204s Done
204s The following additional packages will be installed:
204s   fonts-font-awesome fonts-glyphicons-halflings fonts-lato fonts-mathjax gdb
204s   jupyter-core jupyter-notebook libbabeltrace1 libdebuginfod-common
204s   libdebuginfod1t64 libipt2 libjs-backbone libjs-bootstrap
204s   libjs-bootstrap-tour libjs-codemirror libjs-es6-promise libjs-jed
204s   libjs-jquery libjs-jquery-typeahead libjs-jquery-ui libjs-marked
204s   libjs-mathjax libjs-moment libjs-requirejs libjs-requirejs-text
204s   libjs-sphinxdoc libjs-text-encoding libjs-underscore libjs-xterm libnorm1t64
204s   libpgm-5.3-0t64 libpython3.12t64 libsodium23 libsource-highlight-common
204s   libsource-highlight4t64 libzmq5 node-jed python-notebook-doc
204s   python-tinycss2-common python3-argon2 python3-asttokens python3-bleach
204s   python3-bs4 python3-bytecode python3-comm python3-coverage python3-dateutil
204s   python3-debugpy python3-decorator python3-defusedxml python3-entrypoints
204s   python3-executing python3-fastjsonschema python3-html5lib python3-ipykernel
204s   python3-ipython python3-ipython-genutils python3-jedi python3-jupyter-client
204s   python3-jupyter-core python3-jupyterlab-pygments python3-matplotlib-inline
204s   python3-mistune python3-nbclient python3-nbconvert python3-nbformat
204s   python3-nest-asyncio python3-notebook python3-packaging
204s   python3-pandocfilters python3-parso python3-pexpect python3-platformdirs
204s   python3-prometheus-client python3-prompt-toolkit python3-psutil
204s   python3-ptyprocess python3-pure-eval python3-py python3-pydevd
204s   python3-send2trash python3-soupsieve python3-stack-data python3-terminado
204s   python3-tinycss2 python3-tornado python3-traitlets python3-typeshed
204s   python3-wcwidth python3-webencodings python3-zmq sphinx-rtd-theme-common
204s Suggested packages:
204s   gdb-doc gdbserver libjs-jquery-lazyload libjs-json libjs-jquery-ui-docs
204s 
fonts-mathjax-extras fonts-stix libjs-mathjax-doc python-argon2-doc 204s python-bleach-doc python-bytecode-doc python-coverage-doc 204s python-fastjsonschema-doc python3-genshi python3-lxml python-ipython-doc 204s python3-pip python-nbconvert-doc texlive-fonts-recommended 204s texlive-plain-generic texlive-xetex python-pexpect-doc subversion 204s python3-pytest pydevd python-terminado-doc python-tinycss2-doc 204s python3-pycurl python-tornado-doc python3-twisted 204s Recommended packages: 204s libc-dbg javascript-common python3-lxml python3-matplotlib pandoc 204s python3-ipywidgets 205s The following NEW packages will be installed: 205s autopkgtest-satdep fonts-font-awesome fonts-glyphicons-halflings fonts-lato 205s fonts-mathjax gdb jupyter-core jupyter-notebook libbabeltrace1 205s libdebuginfod-common libdebuginfod1t64 libipt2 libjs-backbone 205s libjs-bootstrap libjs-bootstrap-tour libjs-codemirror libjs-es6-promise 205s libjs-jed libjs-jquery libjs-jquery-typeahead libjs-jquery-ui libjs-marked 205s libjs-mathjax libjs-moment libjs-requirejs libjs-requirejs-text 205s libjs-sphinxdoc libjs-text-encoding libjs-underscore libjs-xterm libnorm1t64 205s libpgm-5.3-0t64 libpython3.12t64 libsodium23 libsource-highlight-common 205s libsource-highlight4t64 libzmq5 node-jed python-notebook-doc 205s python-tinycss2-common python3-argon2 python3-asttokens python3-bleach 205s python3-bs4 python3-bytecode python3-comm python3-coverage python3-dateutil 205s python3-debugpy python3-decorator python3-defusedxml python3-entrypoints 205s python3-executing python3-fastjsonschema python3-html5lib python3-ipykernel 205s python3-ipython python3-ipython-genutils python3-jedi python3-jupyter-client 205s python3-jupyter-core python3-jupyterlab-pygments python3-matplotlib-inline 205s python3-mistune python3-nbclient python3-nbconvert python3-nbformat 205s python3-nest-asyncio python3-notebook python3-packaging 205s python3-pandocfilters python3-parso python3-pexpect python3-platformdirs 
205s python3-prometheus-client python3-prompt-toolkit python3-psutil 205s python3-ptyprocess python3-pure-eval python3-py python3-pydevd 205s python3-send2trash python3-soupsieve python3-stack-data python3-terminado 205s python3-tinycss2 python3-tornado python3-traitlets python3-typeshed 205s python3-wcwidth python3-webencodings python3-zmq sphinx-rtd-theme-common 205s 0 upgraded, 93 newly installed, 0 to remove and 0 not upgraded. 205s Need to get 33.1 MB/33.1 MB of archives. 205s After this operation, 169 MB of additional disk space will be used. 205s Get:1 /tmp/autopkgtest.iPdHX2/2-autopkgtest-satdep.deb autopkgtest-satdep amd64 0 [728 B] 205s Get:2 http://ftpmaster.internal/ubuntu oracular/main amd64 fonts-lato all 2.015-1 [2781 kB] 205s Get:3 http://ftpmaster.internal/ubuntu oracular/main amd64 libdebuginfod-common all 0.191-1 [14.6 kB] 205s Get:4 http://ftpmaster.internal/ubuntu oracular/main amd64 fonts-font-awesome all 5.0.10+really4.7.0~dfsg-4.1 [516 kB] 205s Get:5 http://ftpmaster.internal/ubuntu oracular/universe amd64 fonts-glyphicons-halflings all 1.009~3.4.1+dfsg-3 [118 kB] 205s Get:6 http://ftpmaster.internal/ubuntu oracular/main amd64 fonts-mathjax all 2.7.9+dfsg-1 [2208 kB] 205s Get:7 http://ftpmaster.internal/ubuntu oracular/main amd64 libbabeltrace1 amd64 1.5.11-3build3 [164 kB] 205s Get:8 http://ftpmaster.internal/ubuntu oracular/main amd64 libdebuginfod1t64 amd64 0.191-1 [17.1 kB] 205s Get:9 http://ftpmaster.internal/ubuntu oracular/main amd64 libipt2 amd64 2.0.6-1build1 [45.7 kB] 205s Get:10 http://ftpmaster.internal/ubuntu oracular/main amd64 libpython3.12t64 amd64 3.12.4-1 [2338 kB] 205s Get:11 http://ftpmaster.internal/ubuntu oracular/main amd64 libsource-highlight-common all 3.1.9-4.3build1 [64.2 kB] 205s Get:12 http://ftpmaster.internal/ubuntu oracular/main amd64 libsource-highlight4t64 amd64 3.1.9-4.3build1 [258 kB] 205s Get:13 http://ftpmaster.internal/ubuntu oracular/main amd64 gdb amd64 15.0.50.20240403-0ubuntu1 [4010 kB] 205s Get:14 
http://ftpmaster.internal/ubuntu oracular/main amd64 python3-platformdirs all 4.2.1-1 [16.3 kB] 205s Get:15 http://ftpmaster.internal/ubuntu oracular-proposed/universe amd64 python3-traitlets all 5.14.3-1 [71.3 kB] 205s Get:16 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-jupyter-core all 5.3.2-2 [25.5 kB] 205s Get:17 http://ftpmaster.internal/ubuntu oracular/universe amd64 jupyter-core all 5.3.2-2 [4038 B] 205s Get:18 http://ftpmaster.internal/ubuntu oracular/main amd64 libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [118 kB] 205s Get:19 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-backbone all 1.4.1~dfsg+~1.4.15-3 [185 kB] 205s Get:20 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-bootstrap all 3.4.1+dfsg-3 [129 kB] 205s Get:21 http://ftpmaster.internal/ubuntu oracular/main amd64 libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB] 205s Get:22 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-bootstrap-tour all 0.12.0+dfsg-5 [21.4 kB] 205s Get:23 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-codemirror all 5.65.0+~cs5.83.9-3 [755 kB] 205s Get:24 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-es6-promise all 4.2.8-12 [14.1 kB] 205s Get:25 http://ftpmaster.internal/ubuntu oracular/universe amd64 node-jed all 1.1.1-4 [15.2 kB] 205s Get:26 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-jed all 1.1.1-4 [2584 B] 205s Get:27 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-jquery-typeahead all 2.11.0+dfsg1-3 [48.9 kB] 205s Get:28 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-jquery-ui all 1.13.2+dfsg-1 [252 kB] 205s Get:29 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-marked all 4.2.3+ds+~4.0.7-3 [36.2 kB] 205s Get:30 http://ftpmaster.internal/ubuntu oracular/main amd64 libjs-mathjax all 2.7.9+dfsg-1 [5665 kB] 205s Get:31 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-moment all 2.29.4+ds-1 [147 kB] 205s 
Get:32 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-requirejs all 2.3.6+ds+~2.1.37-1 [201 kB] 205s Get:33 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-requirejs-text all 2.0.12-1.1 [9056 B] 205s Get:34 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-text-encoding all 0.7.0-5 [140 kB] 205s Get:35 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-xterm all 5.3.0-2 [476 kB] 205s Get:36 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-ptyprocess all 0.7.0-5 [15.1 kB] 205s Get:37 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-tornado amd64 6.4.1-1 [298 kB] 205s Get:38 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-terminado all 0.18.1-1 [13.2 kB] 205s Get:39 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-argon2 amd64 21.1.0-2build1 [21.0 kB] 205s Get:40 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-comm all 0.2.1-1 [7016 B] 205s Get:41 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-bytecode all 0.15.1-3 [44.7 kB] 205s Get:42 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-coverage amd64 7.4.4+dfsg1-0ubuntu2 [147 kB] 205s Get:43 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-pydevd amd64 2.10.0+ds-10ubuntu1 [637 kB] 205s Get:44 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-debugpy all 1.8.0+ds-4ubuntu4 [67.6 kB] 205s Get:45 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-decorator all 5.1.1-5 [10.1 kB] 205s Get:46 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-parso all 0.8.3-1 [67.2 kB] 205s Get:47 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-typeshed all 0.0~git20231111.6764465-3 [1274 kB] 205s Get:48 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-jedi all 0.19.1+ds1-1 [693 kB] 205s Get:49 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-matplotlib-inline all 0.1.6-2 [8784 
B] 205s Get:50 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-pexpect all 4.9-2 [48.1 kB] 205s Get:51 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-wcwidth all 0.2.5+dfsg1-1.1ubuntu1 [22.5 kB] 205s Get:52 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-prompt-toolkit all 3.0.46-1 [256 kB] 205s Get:53 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-asttokens all 2.4.1-1 [20.9 kB] 205s Get:54 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-executing all 2.0.1-0.1 [23.3 kB] 205s Get:55 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-pure-eval all 0.2.2-2 [11.1 kB] 205s Get:56 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-stack-data all 0.6.3-1 [22.0 kB] 205s Get:57 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-ipython all 8.20.0-1ubuntu1 [561 kB] 205s Get:58 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-dateutil all 2.9.0-2 [80.3 kB] 205s Get:59 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-entrypoints all 0.4-2 [7146 B] 205s Get:60 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-nest-asyncio all 1.5.4-1 [6256 B] 205s Get:61 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-py all 1.11.0-2 [72.7 kB] 205s Get:62 http://ftpmaster.internal/ubuntu oracular/universe amd64 libnorm1t64 amd64 1.5.9+dfsg-3.1build1 [154 kB] 205s Get:63 http://ftpmaster.internal/ubuntu oracular/universe amd64 libpgm-5.3-0t64 amd64 5.3.128~dfsg-2.1build1 [167 kB] 205s Get:64 http://ftpmaster.internal/ubuntu oracular/main amd64 libsodium23 amd64 1.0.18-1build3 [161 kB] 205s Get:65 http://ftpmaster.internal/ubuntu oracular/universe amd64 libzmq5 amd64 4.3.5-1build2 [260 kB] 205s Get:66 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-zmq amd64 24.0.1-5build1 [286 kB] 205s Get:67 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-jupyter-client all 7.4.9-2ubuntu1 [90.5 kB] 
205s Get:68 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-packaging all 24.0-1 [41.1 kB] 205s Get:69 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-psutil amd64 5.9.8-2build2 [195 kB] 205s Get:70 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-ipykernel all 6.29.3-1ubuntu1 [82.6 kB] 205s Get:71 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-ipython-genutils all 0.2.0-6 [22.0 kB] 205s Get:72 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-webencodings all 0.5.1-5 [11.5 kB] 205s Get:73 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-html5lib all 1.1-6 [88.8 kB] 205s Get:74 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-bleach all 6.1.0-2 [49.6 kB] 205s Get:75 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-soupsieve all 2.5-1 [33.0 kB] 205s Get:76 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-bs4 all 4.12.3-1 [109 kB] 205s Get:77 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-defusedxml all 0.7.1-2 [42.0 kB] 205s Get:78 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-jupyterlab-pygments all 0.2.2-3 [6054 B] 205s Get:79 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-mistune all 3.0.2-1 [32.8 kB] 205s Get:80 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-fastjsonschema all 2.19.1-1 [19.7 kB] 205s Get:81 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-nbformat all 5.9.1-1 [41.2 kB] 205s Get:82 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-nbclient all 0.8.0-1 [55.6 kB] 205s Get:83 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-pandocfilters all 1.5.1-1 [23.6 kB] 205s Get:84 http://ftpmaster.internal/ubuntu oracular/universe amd64 python-tinycss2-common all 1.3.0-1 [34.1 kB] 205s Get:85 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-tinycss2 all 1.3.0-1 [19.6 kB] 205s Get:86 
http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-nbconvert all 7.16.4-1 [156 kB] 205s Get:87 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-prometheus-client all 0.19.0+ds1-1 [41.7 kB] 205s Get:88 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-send2trash all 1.8.2-1 [15.5 kB] 205s Get:89 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-notebook all 6.4.12-2.2ubuntu1 [1566 kB] 205s Get:90 http://ftpmaster.internal/ubuntu oracular/universe amd64 jupyter-notebook all 6.4.12-2.2ubuntu1 [10.4 kB] 205s Get:91 http://ftpmaster.internal/ubuntu oracular/main amd64 libjs-sphinxdoc all 7.2.6-8 [150 kB] 205s Get:92 http://ftpmaster.internal/ubuntu oracular/main amd64 sphinx-rtd-theme-common all 2.0.0+dfsg-1 [1012 kB] 205s Get:93 http://ftpmaster.internal/ubuntu oracular/universe amd64 python-notebook-doc all 6.4.12-2.2ubuntu1 [2540 kB] 205s Preconfiguring packages ... 205s Fetched 33.1 MB in 0s (112 MB/s) 205s Selecting previously unselected package fonts-lato. 206s (Reading database ... 74430 files and directories currently installed.) 206s Preparing to unpack .../00-fonts-lato_2.015-1_all.deb ... 206s Unpacking fonts-lato (2.015-1) ... 206s Selecting previously unselected package libdebuginfod-common. 206s Preparing to unpack .../01-libdebuginfod-common_0.191-1_all.deb ... 206s Unpacking libdebuginfod-common (0.191-1) ...
206s Selecting previously unselected package fonts-font-awesome. 206s Preparing to unpack .../02-fonts-font-awesome_5.0.10+really4.7.0~dfsg-4.1_all.deb ... 206s Unpacking fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 206s Selecting previously unselected package fonts-glyphicons-halflings. 206s Preparing to unpack .../03-fonts-glyphicons-halflings_1.009~3.4.1+dfsg-3_all.deb ... 206s Unpacking fonts-glyphicons-halflings (1.009~3.4.1+dfsg-3) ... 206s Selecting previously unselected package fonts-mathjax. 206s Preparing to unpack .../04-fonts-mathjax_2.7.9+dfsg-1_all.deb ... 206s Unpacking fonts-mathjax (2.7.9+dfsg-1) ... 206s Selecting previously unselected package libbabeltrace1:amd64. 206s Preparing to unpack .../05-libbabeltrace1_1.5.11-3build3_amd64.deb ... 206s Unpacking libbabeltrace1:amd64 (1.5.11-3build3) ... 206s Selecting previously unselected package libdebuginfod1t64:amd64. 206s Preparing to unpack .../06-libdebuginfod1t64_0.191-1_amd64.deb ... 206s Unpacking libdebuginfod1t64:amd64 (0.191-1) ... 206s Selecting previously unselected package libipt2. 206s Preparing to unpack .../07-libipt2_2.0.6-1build1_amd64.deb ... 206s Unpacking libipt2 (2.0.6-1build1) ... 206s Selecting previously unselected package libpython3.12t64:amd64. 206s Preparing to unpack .../08-libpython3.12t64_3.12.4-1_amd64.deb ... 206s Unpacking libpython3.12t64:amd64 (3.12.4-1) ... 206s Selecting previously unselected package libsource-highlight-common. 206s Preparing to unpack .../09-libsource-highlight-common_3.1.9-4.3build1_all.deb ... 206s Unpacking libsource-highlight-common (3.1.9-4.3build1) ... 206s Selecting previously unselected package libsource-highlight4t64:amd64. 206s Preparing to unpack .../10-libsource-highlight4t64_3.1.9-4.3build1_amd64.deb ... 206s Unpacking libsource-highlight4t64:amd64 (3.1.9-4.3build1) ... 206s Selecting previously unselected package gdb. 206s Preparing to unpack .../11-gdb_15.0.50.20240403-0ubuntu1_amd64.deb ... 
206s Unpacking gdb (15.0.50.20240403-0ubuntu1) ... 207s Selecting previously unselected package python3-platformdirs. 207s Preparing to unpack .../12-python3-platformdirs_4.2.1-1_all.deb ... 207s Unpacking python3-platformdirs (4.2.1-1) ... 207s Selecting previously unselected package python3-traitlets. 207s Preparing to unpack .../13-python3-traitlets_5.14.3-1_all.deb ... 207s Unpacking python3-traitlets (5.14.3-1) ... 207s Selecting previously unselected package python3-jupyter-core. 207s Preparing to unpack .../14-python3-jupyter-core_5.3.2-2_all.deb ... 207s Unpacking python3-jupyter-core (5.3.2-2) ... 207s Selecting previously unselected package jupyter-core. 207s Preparing to unpack .../15-jupyter-core_5.3.2-2_all.deb ... 207s Unpacking jupyter-core (5.3.2-2) ... 207s Selecting previously unselected package libjs-underscore. 207s Preparing to unpack .../16-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ... 207s Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 207s Selecting previously unselected package libjs-backbone. 207s Preparing to unpack .../17-libjs-backbone_1.4.1~dfsg+~1.4.15-3_all.deb ... 207s Unpacking libjs-backbone (1.4.1~dfsg+~1.4.15-3) ... 207s Selecting previously unselected package libjs-bootstrap. 207s Preparing to unpack .../18-libjs-bootstrap_3.4.1+dfsg-3_all.deb ... 207s Unpacking libjs-bootstrap (3.4.1+dfsg-3) ... 207s Selecting previously unselected package libjs-jquery. 207s Preparing to unpack .../19-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ... 207s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 207s Selecting previously unselected package libjs-bootstrap-tour. 207s Preparing to unpack .../20-libjs-bootstrap-tour_0.12.0+dfsg-5_all.deb ... 207s Unpacking libjs-bootstrap-tour (0.12.0+dfsg-5) ... 207s Selecting previously unselected package libjs-codemirror. 207s Preparing to unpack .../21-libjs-codemirror_5.65.0+~cs5.83.9-3_all.deb ... 207s Unpacking libjs-codemirror (5.65.0+~cs5.83.9-3) ... 
207s Selecting previously unselected package libjs-es6-promise. 207s Preparing to unpack .../22-libjs-es6-promise_4.2.8-12_all.deb ... 207s Unpacking libjs-es6-promise (4.2.8-12) ... 207s Selecting previously unselected package node-jed. 207s Preparing to unpack .../23-node-jed_1.1.1-4_all.deb ... 207s Unpacking node-jed (1.1.1-4) ... 207s Selecting previously unselected package libjs-jed. 207s Preparing to unpack .../24-libjs-jed_1.1.1-4_all.deb ... 207s Unpacking libjs-jed (1.1.1-4) ... 207s Selecting previously unselected package libjs-jquery-typeahead. 207s Preparing to unpack .../25-libjs-jquery-typeahead_2.11.0+dfsg1-3_all.deb ... 207s Unpacking libjs-jquery-typeahead (2.11.0+dfsg1-3) ... 207s Selecting previously unselected package libjs-jquery-ui. 207s Preparing to unpack .../26-libjs-jquery-ui_1.13.2+dfsg-1_all.deb ... 207s Unpacking libjs-jquery-ui (1.13.2+dfsg-1) ... 207s Selecting previously unselected package libjs-marked. 207s Preparing to unpack .../27-libjs-marked_4.2.3+ds+~4.0.7-3_all.deb ... 207s Unpacking libjs-marked (4.2.3+ds+~4.0.7-3) ... 207s Selecting previously unselected package libjs-mathjax. 207s Preparing to unpack .../28-libjs-mathjax_2.7.9+dfsg-1_all.deb ... 207s Unpacking libjs-mathjax (2.7.9+dfsg-1) ... 208s Selecting previously unselected package libjs-moment. 208s Preparing to unpack .../29-libjs-moment_2.29.4+ds-1_all.deb ... 208s Unpacking libjs-moment (2.29.4+ds-1) ... 208s Selecting previously unselected package libjs-requirejs. 208s Preparing to unpack .../30-libjs-requirejs_2.3.6+ds+~2.1.37-1_all.deb ... 208s Unpacking libjs-requirejs (2.3.6+ds+~2.1.37-1) ... 208s Selecting previously unselected package libjs-requirejs-text. 208s Preparing to unpack .../31-libjs-requirejs-text_2.0.12-1.1_all.deb ... 208s Unpacking libjs-requirejs-text (2.0.12-1.1) ... 208s Selecting previously unselected package libjs-text-encoding. 208s Preparing to unpack .../32-libjs-text-encoding_0.7.0-5_all.deb ... 
208s Unpacking libjs-text-encoding (0.7.0-5) ... 208s Selecting previously unselected package libjs-xterm. 208s Preparing to unpack .../33-libjs-xterm_5.3.0-2_all.deb ... 208s Unpacking libjs-xterm (5.3.0-2) ... 208s Selecting previously unselected package python3-ptyprocess. 208s Preparing to unpack .../34-python3-ptyprocess_0.7.0-5_all.deb ... 208s Unpacking python3-ptyprocess (0.7.0-5) ... 208s Selecting previously unselected package python3-tornado. 208s Preparing to unpack .../35-python3-tornado_6.4.1-1_amd64.deb ... 208s Unpacking python3-tornado (6.4.1-1) ... 208s Selecting previously unselected package python3-terminado. 208s Preparing to unpack .../36-python3-terminado_0.18.1-1_all.deb ... 208s Unpacking python3-terminado (0.18.1-1) ... 208s Selecting previously unselected package python3-argon2. 208s Preparing to unpack .../37-python3-argon2_21.1.0-2build1_amd64.deb ... 208s Unpacking python3-argon2 (21.1.0-2build1) ... 208s Selecting previously unselected package python3-comm. 208s Preparing to unpack .../38-python3-comm_0.2.1-1_all.deb ... 208s Unpacking python3-comm (0.2.1-1) ... 208s Selecting previously unselected package python3-bytecode. 208s Preparing to unpack .../39-python3-bytecode_0.15.1-3_all.deb ... 208s Unpacking python3-bytecode (0.15.1-3) ... 208s Selecting previously unselected package python3-coverage. 208s Preparing to unpack .../40-python3-coverage_7.4.4+dfsg1-0ubuntu2_amd64.deb ... 208s Unpacking python3-coverage (7.4.4+dfsg1-0ubuntu2) ... 208s Selecting previously unselected package python3-pydevd. 208s Preparing to unpack .../41-python3-pydevd_2.10.0+ds-10ubuntu1_amd64.deb ... 208s Unpacking python3-pydevd (2.10.0+ds-10ubuntu1) ... 208s Selecting previously unselected package python3-debugpy. 208s Preparing to unpack .../42-python3-debugpy_1.8.0+ds-4ubuntu4_all.deb ... 208s Unpacking python3-debugpy (1.8.0+ds-4ubuntu4) ... 208s Selecting previously unselected package python3-decorator. 
208s Preparing to unpack .../43-python3-decorator_5.1.1-5_all.deb ... 208s Unpacking python3-decorator (5.1.1-5) ... 208s Selecting previously unselected package python3-parso. 208s Preparing to unpack .../44-python3-parso_0.8.3-1_all.deb ... 208s Unpacking python3-parso (0.8.3-1) ... 208s Selecting previously unselected package python3-typeshed. 208s Preparing to unpack .../45-python3-typeshed_0.0~git20231111.6764465-3_all.deb ... 208s Unpacking python3-typeshed (0.0~git20231111.6764465-3) ... 209s Selecting previously unselected package python3-jedi. 209s Preparing to unpack .../46-python3-jedi_0.19.1+ds1-1_all.deb ... 209s Unpacking python3-jedi (0.19.1+ds1-1) ... 209s Selecting previously unselected package python3-matplotlib-inline. 209s Preparing to unpack .../47-python3-matplotlib-inline_0.1.6-2_all.deb ... 209s Unpacking python3-matplotlib-inline (0.1.6-2) ... 209s Selecting previously unselected package python3-pexpect. 209s Preparing to unpack .../48-python3-pexpect_4.9-2_all.deb ... 209s Unpacking python3-pexpect (4.9-2) ... 209s Selecting previously unselected package python3-wcwidth. 209s Preparing to unpack .../49-python3-wcwidth_0.2.5+dfsg1-1.1ubuntu1_all.deb ... 209s Unpacking python3-wcwidth (0.2.5+dfsg1-1.1ubuntu1) ... 209s Selecting previously unselected package python3-prompt-toolkit. 209s Preparing to unpack .../50-python3-prompt-toolkit_3.0.46-1_all.deb ... 209s Unpacking python3-prompt-toolkit (3.0.46-1) ... 209s Selecting previously unselected package python3-asttokens. 209s Preparing to unpack .../51-python3-asttokens_2.4.1-1_all.deb ... 209s Unpacking python3-asttokens (2.4.1-1) ... 209s Selecting previously unselected package python3-executing. 209s Preparing to unpack .../52-python3-executing_2.0.1-0.1_all.deb ... 209s Unpacking python3-executing (2.0.1-0.1) ... 209s Selecting previously unselected package python3-pure-eval. 209s Preparing to unpack .../53-python3-pure-eval_0.2.2-2_all.deb ... 
209s Unpacking python3-pure-eval (0.2.2-2) ... 210s Selecting previously unselected package python3-stack-data. 210s Preparing to unpack .../54-python3-stack-data_0.6.3-1_all.deb ... 210s Unpacking python3-stack-data (0.6.3-1) ... 210s Selecting previously unselected package python3-ipython. 210s Preparing to unpack .../55-python3-ipython_8.20.0-1ubuntu1_all.deb ... 210s Unpacking python3-ipython (8.20.0-1ubuntu1) ... 210s Selecting previously unselected package python3-dateutil. 210s Preparing to unpack .../56-python3-dateutil_2.9.0-2_all.deb ... 210s Unpacking python3-dateutil (2.9.0-2) ... 210s Selecting previously unselected package python3-entrypoints. 210s Preparing to unpack .../57-python3-entrypoints_0.4-2_all.deb ... 210s Unpacking python3-entrypoints (0.4-2) ... 210s Selecting previously unselected package python3-nest-asyncio. 210s Preparing to unpack .../58-python3-nest-asyncio_1.5.4-1_all.deb ... 210s Unpacking python3-nest-asyncio (1.5.4-1) ... 210s Selecting previously unselected package python3-py. 210s Preparing to unpack .../59-python3-py_1.11.0-2_all.deb ... 210s Unpacking python3-py (1.11.0-2) ... 210s Selecting previously unselected package libnorm1t64:amd64. 210s Preparing to unpack .../60-libnorm1t64_1.5.9+dfsg-3.1build1_amd64.deb ... 210s Unpacking libnorm1t64:amd64 (1.5.9+dfsg-3.1build1) ... 210s Selecting previously unselected package libpgm-5.3-0t64:amd64. 210s Preparing to unpack .../61-libpgm-5.3-0t64_5.3.128~dfsg-2.1build1_amd64.deb ... 210s Unpacking libpgm-5.3-0t64:amd64 (5.3.128~dfsg-2.1build1) ... 210s Selecting previously unselected package libsodium23:amd64. 210s Preparing to unpack .../62-libsodium23_1.0.18-1build3_amd64.deb ... 210s Unpacking libsodium23:amd64 (1.0.18-1build3) ... 210s Selecting previously unselected package libzmq5:amd64. 210s Preparing to unpack .../63-libzmq5_4.3.5-1build2_amd64.deb ... 210s Unpacking libzmq5:amd64 (4.3.5-1build2) ... 210s Selecting previously unselected package python3-zmq. 
210s Preparing to unpack .../64-python3-zmq_24.0.1-5build1_amd64.deb ... 210s Unpacking python3-zmq (24.0.1-5build1) ... 210s Selecting previously unselected package python3-jupyter-client. 210s Preparing to unpack .../65-python3-jupyter-client_7.4.9-2ubuntu1_all.deb ... 210s Unpacking python3-jupyter-client (7.4.9-2ubuntu1) ... 210s Selecting previously unselected package python3-packaging. 210s Preparing to unpack .../66-python3-packaging_24.0-1_all.deb ... 210s Unpacking python3-packaging (24.0-1) ... 210s Selecting previously unselected package python3-psutil. 210s Preparing to unpack .../67-python3-psutil_5.9.8-2build2_amd64.deb ... 210s Unpacking python3-psutil (5.9.8-2build2) ... 210s Selecting previously unselected package python3-ipykernel. 210s Preparing to unpack .../68-python3-ipykernel_6.29.3-1ubuntu1_all.deb ... 210s Unpacking python3-ipykernel (6.29.3-1ubuntu1) ... 210s Selecting previously unselected package python3-ipython-genutils. 210s Preparing to unpack .../69-python3-ipython-genutils_0.2.0-6_all.deb ... 210s Unpacking python3-ipython-genutils (0.2.0-6) ... 210s Selecting previously unselected package python3-webencodings. 210s Preparing to unpack .../70-python3-webencodings_0.5.1-5_all.deb ... 210s Unpacking python3-webencodings (0.5.1-5) ... 210s Selecting previously unselected package python3-html5lib. 210s Preparing to unpack .../71-python3-html5lib_1.1-6_all.deb ... 210s Unpacking python3-html5lib (1.1-6) ... 210s Selecting previously unselected package python3-bleach. 210s Preparing to unpack .../72-python3-bleach_6.1.0-2_all.deb ... 210s Unpacking python3-bleach (6.1.0-2) ... 210s Selecting previously unselected package python3-soupsieve. 210s Preparing to unpack .../73-python3-soupsieve_2.5-1_all.deb ... 210s Unpacking python3-soupsieve (2.5-1) ... 210s Selecting previously unselected package python3-bs4. 210s Preparing to unpack .../74-python3-bs4_4.12.3-1_all.deb ... 210s Unpacking python3-bs4 (4.12.3-1) ... 
210s Selecting previously unselected package python3-defusedxml. 210s Preparing to unpack .../75-python3-defusedxml_0.7.1-2_all.deb ... 210s Unpacking python3-defusedxml (0.7.1-2) ... 210s Selecting previously unselected package python3-jupyterlab-pygments. 210s Preparing to unpack .../76-python3-jupyterlab-pygments_0.2.2-3_all.deb ... 210s Unpacking python3-jupyterlab-pygments (0.2.2-3) ... 210s Selecting previously unselected package python3-mistune. 210s Preparing to unpack .../77-python3-mistune_3.0.2-1_all.deb ... 210s Unpacking python3-mistune (3.0.2-1) ... 210s Selecting previously unselected package python3-fastjsonschema. 210s Preparing to unpack .../78-python3-fastjsonschema_2.19.1-1_all.deb ... 210s Unpacking python3-fastjsonschema (2.19.1-1) ... 210s Selecting previously unselected package python3-nbformat. 210s Preparing to unpack .../79-python3-nbformat_5.9.1-1_all.deb ... 210s Unpacking python3-nbformat (5.9.1-1) ... 210s Selecting previously unselected package python3-nbclient. 210s Preparing to unpack .../80-python3-nbclient_0.8.0-1_all.deb ... 210s Unpacking python3-nbclient (0.8.0-1) ... 210s Selecting previously unselected package python3-pandocfilters. 210s Preparing to unpack .../81-python3-pandocfilters_1.5.1-1_all.deb ... 210s Unpacking python3-pandocfilters (1.5.1-1) ... 210s Selecting previously unselected package python-tinycss2-common. 211s Preparing to unpack .../82-python-tinycss2-common_1.3.0-1_all.deb ... 211s Unpacking python-tinycss2-common (1.3.0-1) ... 211s Selecting previously unselected package python3-tinycss2. 211s Preparing to unpack .../83-python3-tinycss2_1.3.0-1_all.deb ... 211s Unpacking python3-tinycss2 (1.3.0-1) ... 211s Selecting previously unselected package python3-nbconvert. 211s Preparing to unpack .../84-python3-nbconvert_7.16.4-1_all.deb ... 211s Unpacking python3-nbconvert (7.16.4-1) ... 211s Selecting previously unselected package python3-prometheus-client. 
211s Preparing to unpack .../85-python3-prometheus-client_0.19.0+ds1-1_all.deb ... 211s Unpacking python3-prometheus-client (0.19.0+ds1-1) ... 211s Selecting previously unselected package python3-send2trash. 211s Preparing to unpack .../86-python3-send2trash_1.8.2-1_all.deb ... 211s Unpacking python3-send2trash (1.8.2-1) ... 211s Selecting previously unselected package python3-notebook. 211s Preparing to unpack .../87-python3-notebook_6.4.12-2.2ubuntu1_all.deb ... 211s Unpacking python3-notebook (6.4.12-2.2ubuntu1) ... 211s Selecting previously unselected package jupyter-notebook. 211s Preparing to unpack .../88-jupyter-notebook_6.4.12-2.2ubuntu1_all.deb ... 211s Unpacking jupyter-notebook (6.4.12-2.2ubuntu1) ... 211s Selecting previously unselected package libjs-sphinxdoc. 211s Preparing to unpack .../89-libjs-sphinxdoc_7.2.6-8_all.deb ... 211s Unpacking libjs-sphinxdoc (7.2.6-8) ... 211s Selecting previously unselected package sphinx-rtd-theme-common. 211s Preparing to unpack .../90-sphinx-rtd-theme-common_2.0.0+dfsg-1_all.deb ... 211s Unpacking sphinx-rtd-theme-common (2.0.0+dfsg-1) ... 211s Selecting previously unselected package python-notebook-doc. 211s Preparing to unpack .../91-python-notebook-doc_6.4.12-2.2ubuntu1_all.deb ... 211s Unpacking python-notebook-doc (6.4.12-2.2ubuntu1) ... 211s Selecting previously unselected package autopkgtest-satdep. 211s Preparing to unpack .../92-2-autopkgtest-satdep.deb ... 211s Unpacking autopkgtest-satdep (0) ... 211s Setting up python3-entrypoints (0.4-2) ... 211s Setting up libjs-jquery-typeahead (2.11.0+dfsg1-3) ... 211s Setting up python3-tornado (6.4.1-1) ... 212s Setting up libnorm1t64:amd64 (1.5.9+dfsg-3.1build1) ... 212s Setting up python3-pure-eval (0.2.2-2) ... 212s Setting up python3-send2trash (1.8.2-1) ... 212s Setting up fonts-lato (2.015-1) ... 212s Setting up fonts-mathjax (2.7.9+dfsg-1) ... 212s Setting up libsodium23:amd64 (1.0.18-1build3) ... 212s Setting up libjs-mathjax (2.7.9+dfsg-1) ... 
212s Setting up python3-py (1.11.0-2) ... 212s Setting up libdebuginfod-common (0.191-1) ... 212s Setting up libjs-requirejs-text (2.0.12-1.1) ... 212s Setting up python3-parso (0.8.3-1) ... 212s Setting up python3-defusedxml (0.7.1-2) ... 212s Setting up python3-ipython-genutils (0.2.0-6) ... 212s Setting up python3-asttokens (2.4.1-1) ... 213s Setting up fonts-glyphicons-halflings (1.009~3.4.1+dfsg-3) ... 213s Setting up python3-coverage (7.4.4+dfsg1-0ubuntu2) ... 213s Setting up libjs-moment (2.29.4+ds-1) ... 213s Setting up python3-pandocfilters (1.5.1-1) ... 213s Setting up libjs-requirejs (2.3.6+ds+~2.1.37-1) ... 213s Setting up libjs-es6-promise (4.2.8-12) ... 213s Setting up libjs-text-encoding (0.7.0-5) ... 213s Setting up python3-webencodings (0.5.1-5) ... 213s Setting up python3-platformdirs (4.2.1-1) ... 213s Setting up python3-psutil (5.9.8-2build2) ... 213s Setting up libsource-highlight-common (3.1.9-4.3build1) ... 213s Setting up python3-jupyterlab-pygments (0.2.2-3) ... 213s Setting up libpython3.12t64:amd64 (3.12.4-1) ... 213s Setting up libpgm-5.3-0t64:amd64 (5.3.128~dfsg-2.1build1) ... 213s Setting up python3-decorator (5.1.1-5) ... 214s Setting up python3-packaging (24.0-1) ... 214s Setting up python3-wcwidth (0.2.5+dfsg1-1.1ubuntu1) ... 214s Setting up node-jed (1.1.1-4) ... 214s Setting up python3-typeshed (0.0~git20231111.6764465-3) ... 214s Setting up python3-executing (2.0.1-0.1) ... 214s Setting up libjs-xterm (5.3.0-2) ... 214s Setting up python3-nest-asyncio (1.5.4-1) ... 214s Setting up python3-bytecode (0.15.1-3) ... 214s Setting up libjs-codemirror (5.65.0+~cs5.83.9-3) ... 214s Setting up libjs-jed (1.1.1-4) ... 214s Setting up libipt2 (2.0.6-1build1) ... 214s Setting up python3-html5lib (1.1-6) ... 214s Setting up libbabeltrace1:amd64 (1.5.11-3build3) ... 214s Setting up python3-fastjsonschema (2.19.1-1) ... 215s Setting up python3-traitlets (5.14.3-1) ... 215s Setting up python-tinycss2-common (1.3.0-1) ... 
215s Setting up python3-argon2 (21.1.0-2build1) ... 215s Setting up python3-dateutil (2.9.0-2) ... 215s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 215s Setting up python3-mistune (3.0.2-1) ... 215s Setting up python3-stack-data (0.6.3-1) ... 215s Setting up python3-soupsieve (2.5-1) ... 215s Setting up fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 215s Setting up sphinx-rtd-theme-common (2.0.0+dfsg-1) ... 215s Setting up python3-jupyter-core (5.3.2-2) ... 216s Setting up libjs-bootstrap (3.4.1+dfsg-3) ... 216s Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 216s Setting up python3-ptyprocess (0.7.0-5) ... 216s Setting up libjs-marked (4.2.3+ds+~4.0.7-3) ... 216s Setting up python3-prompt-toolkit (3.0.46-1) ... 216s Setting up libdebuginfod1t64:amd64 (0.191-1) ... 216s Setting up python3-tinycss2 (1.3.0-1) ... 216s Setting up libzmq5:amd64 (4.3.5-1build2) ... 216s Setting up python3-jedi (0.19.1+ds1-1) ... 216s Setting up libjs-bootstrap-tour (0.12.0+dfsg-5) ... 216s Setting up libjs-backbone (1.4.1~dfsg+~1.4.15-3) ... 216s Setting up libsource-highlight4t64:amd64 (3.1.9-4.3build1) ... 216s Setting up python3-nbformat (5.9.1-1) ... 217s Setting up python3-bs4 (4.12.3-1) ... 217s Setting up python3-bleach (6.1.0-2) ... 217s Setting up python3-matplotlib-inline (0.1.6-2) ... 217s Setting up python3-comm (0.2.1-1) ... 217s Setting up python3-prometheus-client (0.19.0+ds1-1) ... 217s Setting up gdb (15.0.50.20240403-0ubuntu1) ... 217s Setting up libjs-jquery-ui (1.13.2+dfsg-1) ... 217s Setting up python3-pexpect (4.9-2) ... 217s Setting up python3-zmq (24.0.1-5build1) ... 218s Setting up libjs-sphinxdoc (7.2.6-8) ... 218s Setting up python3-terminado (0.18.1-1) ... 218s Setting up python3-jupyter-client (7.4.9-2ubuntu1) ... 218s Setting up jupyter-core (5.3.2-2) ... 218s Setting up python3-pydevd (2.10.0+ds-10ubuntu1) ... 219s Setting up python3-debugpy (1.8.0+ds-4ubuntu4) ... 219s Setting up python-notebook-doc (6.4.12-2.2ubuntu1) ... 
219s Setting up python3-nbclient (0.8.0-1) ... 219s Setting up python3-ipython (8.20.0-1ubuntu1) ... 219s Setting up python3-ipykernel (6.29.3-1ubuntu1) ... 220s Setting up python3-nbconvert (7.16.4-1) ... 220s Setting up python3-notebook (6.4.12-2.2ubuntu1) ... 220s Setting up jupyter-notebook (6.4.12-2.2ubuntu1) ... 220s Setting up autopkgtest-satdep (0) ... 220s Processing triggers for man-db (2.12.1-2) ... 221s Processing triggers for libc-bin (2.39-0ubuntu9) ... 224s (Reading database ... 90898 files and directories currently installed.) 224s Removing autopkgtest-satdep (0) ... 225s autopkgtest [11:55:13]: test command1: find /usr/lib/python3/dist-packages/notebook -xtype l >&2 225s autopkgtest [11:55:13]: test command1: [----------------------- 225s autopkgtest [11:55:13]: test command1: -----------------------] 225s command1 PASS (superficial) 225s autopkgtest [11:55:13]: test command1: - - - - - - - - - - results - - - - - - - - - - 225s autopkgtest [11:55:13]: test autodep8-python3: preparing testbed 280s autopkgtest [11:56:08]: testbed dpkg architecture: amd64 280s autopkgtest [11:56:08]: testbed apt version: 2.9.3 280s autopkgtest [11:56:08]: @@@@@@@@@@@@@@@@@@@@ test bed setup 280s Get:1 http://ftpmaster.internal/ubuntu oracular-proposed InRelease [110 kB] 280s Get:2 http://ftpmaster.internal/ubuntu oracular-proposed/main Sources [36.1 kB] 280s Get:3 http://ftpmaster.internal/ubuntu oracular-proposed/universe Sources [389 kB] 280s Get:4 http://ftpmaster.internal/ubuntu oracular-proposed/multiverse Sources [2576 B] 280s Get:5 http://ftpmaster.internal/ubuntu oracular-proposed/restricted Sources [7052 B] 280s Get:6 http://ftpmaster.internal/ubuntu oracular-proposed/main amd64 Packages [53.8 kB] 280s Get:7 http://ftpmaster.internal/ubuntu oracular-proposed/main i386 Packages [38.2 kB] 280s Get:8 http://ftpmaster.internal/ubuntu oracular-proposed/restricted amd64 Packages [28.9 kB] 280s Get:9 http://ftpmaster.internal/ubuntu oracular-proposed/restricted 
i386 Packages [6732 B] 280s Get:10 http://ftpmaster.internal/ubuntu oracular-proposed/universe i386 Packages [137 kB] 280s Get:11 http://ftpmaster.internal/ubuntu oracular-proposed/universe amd64 Packages [316 kB] 280s Get:12 http://ftpmaster.internal/ubuntu oracular-proposed/multiverse i386 Packages [3884 B] 280s Get:13 http://ftpmaster.internal/ubuntu oracular-proposed/multiverse amd64 Packages [8364 B] 280s Fetched 1138 kB in 0s (4890 kB/s) 280s Reading package lists... 282s Reading package lists... 282s Building dependency tree... 282s Reading state information... 282s Calculating upgrade... 282s The following packages will be upgraded: 282s apt apt-utils libapt-pkg6.0t64 libldap-common libldap2 282s 5 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 282s Need to get 2875 kB of archives. 282s After this operation, 11.3 kB of additional disk space will be used. 282s Get:1 http://ftpmaster.internal/ubuntu oracular/main amd64 libapt-pkg6.0t64 amd64 2.9.5 [1023 kB] 282s Get:2 http://ftpmaster.internal/ubuntu oracular/main amd64 apt amd64 2.9.5 [1403 kB] 282s Get:3 http://ftpmaster.internal/ubuntu oracular/main amd64 apt-utils amd64 2.9.5 [223 kB] 282s Get:4 http://ftpmaster.internal/ubuntu oracular/main amd64 libldap-common all 2.6.7+dfsg-1~exp1ubuntu9 [31.5 kB] 282s Get:5 http://ftpmaster.internal/ubuntu oracular/main amd64 libldap2 amd64 2.6.7+dfsg-1~exp1ubuntu9 [195 kB] 283s Fetched 2875 kB in 0s (37.1 MB/s) 283s (Reading database ... 74430 files and directories currently installed.) 283s Preparing to unpack .../libapt-pkg6.0t64_2.9.5_amd64.deb ... 283s Unpacking libapt-pkg6.0t64:amd64 (2.9.5) over (2.9.3) ... 283s Setting up libapt-pkg6.0t64:amd64 (2.9.5) ... 283s (Reading database ... 74430 files and directories currently installed.) 283s Preparing to unpack .../archives/apt_2.9.5_amd64.deb ... 283s Unpacking apt (2.9.5) over (2.9.3) ... 283s Setting up apt (2.9.5) ... 284s (Reading database ... 74430 files and directories currently installed.) 284s Preparing to unpack .../apt-utils_2.9.5_amd64.deb ... 284s Unpacking apt-utils (2.9.5) over (2.9.3) ... 284s Preparing to unpack .../libldap-common_2.6.7+dfsg-1~exp1ubuntu9_all.deb ... 284s Unpacking libldap-common (2.6.7+dfsg-1~exp1ubuntu9) over (2.6.7+dfsg-1~exp1ubuntu8) ... 284s Preparing to unpack .../libldap2_2.6.7+dfsg-1~exp1ubuntu9_amd64.deb ...
284s Unpacking libldap2:amd64 (2.6.7+dfsg-1~exp1ubuntu9) over (2.6.7+dfsg-1~exp1ubuntu8) ... 284s Setting up apt-utils (2.9.5) ... 284s Setting up libldap-common (2.6.7+dfsg-1~exp1ubuntu9) ... 284s Setting up libldap2:amd64 (2.6.7+dfsg-1~exp1ubuntu9) ... 284s Processing triggers for man-db (2.12.1-2) ... 286s Processing triggers for libc-bin (2.39-0ubuntu9) ... 287s Reading package lists... 287s Building dependency tree... 287s Reading state information... 287s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 288s Hit:1 http://ftpmaster.internal/ubuntu oracular-proposed InRelease 288s Hit:2 http://ftpmaster.internal/ubuntu oracular InRelease 288s Hit:3 http://ftpmaster.internal/ubuntu oracular-updates InRelease 288s Hit:4 http://ftpmaster.internal/ubuntu oracular-security InRelease 289s Reading package lists... 289s Reading package lists... 289s Building dependency tree... 289s Reading state information... 289s Calculating upgrade... 290s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 290s Reading package lists... 290s Building dependency tree... 290s Reading state information... 290s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 290s autopkgtest [11:56:18]: rebooting testbed after setup commands that affected boot 294s autopkgtest-virt-ssh: WARNING: ssh connection failed. Retrying in 3 seconds... 304s autopkgtest-virt-ssh: WARNING: ssh connection failed. Retrying in 3 seconds... 309s Reading package lists... 309s Building dependency tree... 309s Reading state information... 
310s Starting pkgProblemResolver with broken count: 0 310s Starting 2 pkgProblemResolver with broken count: 0 310s Done 310s The following additional packages will be installed: 310s fonts-font-awesome fonts-glyphicons-halflings fonts-mathjax gdb 310s libbabeltrace1 libdebuginfod-common libdebuginfod1t64 libipt2 libjs-backbone 310s libjs-bootstrap libjs-bootstrap-tour libjs-codemirror libjs-es6-promise 310s libjs-jed libjs-jquery libjs-jquery-typeahead libjs-jquery-ui libjs-marked 310s libjs-mathjax libjs-moment libjs-requirejs libjs-requirejs-text 310s libjs-text-encoding libjs-underscore libjs-xterm libnorm1t64 libpgm-5.3-0t64 310s libpython3.12t64 libsodium23 libsource-highlight-common 310s libsource-highlight4t64 libzmq5 node-jed python-tinycss2-common python3-all 310s python3-argon2 python3-asttokens python3-bleach python3-bs4 python3-bytecode 310s python3-comm python3-coverage python3-dateutil python3-debugpy 310s python3-decorator python3-defusedxml python3-entrypoints python3-executing 310s python3-fastjsonschema python3-html5lib python3-ipykernel python3-ipython 310s python3-ipython-genutils python3-jedi python3-jupyter-client 310s python3-jupyter-core python3-jupyterlab-pygments python3-matplotlib-inline 310s python3-mistune python3-nbclient python3-nbconvert python3-nbformat 310s python3-nest-asyncio python3-notebook python3-packaging 310s python3-pandocfilters python3-parso python3-pexpect python3-platformdirs 310s python3-prometheus-client python3-prompt-toolkit python3-psutil 310s python3-ptyprocess python3-pure-eval python3-py python3-pydevd 310s python3-send2trash python3-soupsieve python3-stack-data python3-terminado 310s python3-tinycss2 python3-tornado python3-traitlets python3-typeshed 310s python3-wcwidth python3-webencodings python3-zmq 310s Suggested packages: 310s gdb-doc gdbserver libjs-jquery-lazyload libjs-json libjs-jquery-ui-docs 310s fonts-mathjax-extras fonts-stix libjs-mathjax-doc python-argon2-doc 310s python-bleach-doc 
python-bytecode-doc python-coverage-doc 310s python-fastjsonschema-doc python3-genshi python3-lxml python-ipython-doc 310s python3-pip python-nbconvert-doc texlive-fonts-recommended 310s texlive-plain-generic texlive-xetex python-notebook-doc python-pexpect-doc 310s subversion python3-pytest pydevd python-terminado-doc python-tinycss2-doc 310s python3-pycurl python-tornado-doc python3-twisted 310s Recommended packages: 310s libc-dbg javascript-common python3-lxml python3-matplotlib pandoc 310s python3-ipywidgets 310s The following NEW packages will be installed: 310s autopkgtest-satdep fonts-font-awesome fonts-glyphicons-halflings 310s fonts-mathjax gdb libbabeltrace1 libdebuginfod-common libdebuginfod1t64 310s libipt2 libjs-backbone libjs-bootstrap libjs-bootstrap-tour libjs-codemirror 310s libjs-es6-promise libjs-jed libjs-jquery libjs-jquery-typeahead 310s libjs-jquery-ui libjs-marked libjs-mathjax libjs-moment libjs-requirejs 310s libjs-requirejs-text libjs-text-encoding libjs-underscore libjs-xterm 310s libnorm1t64 libpgm-5.3-0t64 libpython3.12t64 libsodium23 310s libsource-highlight-common libsource-highlight4t64 libzmq5 node-jed 310s python-tinycss2-common python3-all python3-argon2 python3-asttokens 310s python3-bleach python3-bs4 python3-bytecode python3-comm python3-coverage 310s python3-dateutil python3-debugpy python3-decorator python3-defusedxml 310s python3-entrypoints python3-executing python3-fastjsonschema 310s python3-html5lib python3-ipykernel python3-ipython python3-ipython-genutils 310s python3-jedi python3-jupyter-client python3-jupyter-core 310s python3-jupyterlab-pygments python3-matplotlib-inline python3-mistune 310s python3-nbclient python3-nbconvert python3-nbformat python3-nest-asyncio 310s python3-notebook python3-packaging python3-pandocfilters python3-parso 310s python3-pexpect python3-platformdirs python3-prometheus-client 310s python3-prompt-toolkit python3-psutil python3-ptyprocess python3-pure-eval 310s python3-py python3-pydevd 
python3-send2trash python3-soupsieve 310s python3-stack-data python3-terminado python3-tinycss2 python3-tornado 310s python3-traitlets python3-typeshed python3-wcwidth python3-webencodings 310s python3-zmq 310s 0 upgraded, 88 newly installed, 0 to remove and 0 not upgraded. 310s Need to get 26.6 MB/26.7 MB of archives. 310s After this operation, 150 MB of additional disk space will be used. 310s Get:1 /tmp/autopkgtest.iPdHX2/3-autopkgtest-satdep.deb autopkgtest-satdep amd64 0 [712 B] 310s Get:2 http://ftpmaster.internal/ubuntu oracular/main amd64 libdebuginfod-common all 0.191-1 [14.6 kB] 310s Get:3 http://ftpmaster.internal/ubuntu oracular/main amd64 fonts-font-awesome all 5.0.10+really4.7.0~dfsg-4.1 [516 kB] 310s Get:4 http://ftpmaster.internal/ubuntu oracular/universe amd64 fonts-glyphicons-halflings all 1.009~3.4.1+dfsg-3 [118 kB] 310s Get:5 http://ftpmaster.internal/ubuntu oracular/main amd64 fonts-mathjax all 2.7.9+dfsg-1 [2208 kB] 310s Get:6 http://ftpmaster.internal/ubuntu oracular/main amd64 libbabeltrace1 amd64 1.5.11-3build3 [164 kB] 310s Get:7 http://ftpmaster.internal/ubuntu oracular/main amd64 libdebuginfod1t64 amd64 0.191-1 [17.1 kB] 310s Get:8 http://ftpmaster.internal/ubuntu oracular/main amd64 libipt2 amd64 2.0.6-1build1 [45.7 kB] 310s Get:9 http://ftpmaster.internal/ubuntu oracular/main amd64 libpython3.12t64 amd64 3.12.4-1 [2338 kB] 310s Get:10 http://ftpmaster.internal/ubuntu oracular/main amd64 libsource-highlight-common all 3.1.9-4.3build1 [64.2 kB] 310s Get:11 http://ftpmaster.internal/ubuntu oracular/main amd64 libsource-highlight4t64 amd64 3.1.9-4.3build1 [258 kB] 310s Get:12 http://ftpmaster.internal/ubuntu oracular/main amd64 gdb amd64 15.0.50.20240403-0ubuntu1 [4010 kB] 310s Get:13 http://ftpmaster.internal/ubuntu oracular/main amd64 libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [118 kB] 310s Get:14 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-backbone all 1.4.1~dfsg+~1.4.15-3 [185 kB] 310s Get:15 
http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-bootstrap all 3.4.1+dfsg-3 [129 kB] 310s Get:16 http://ftpmaster.internal/ubuntu oracular/main amd64 libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB] 310s Get:17 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-bootstrap-tour all 0.12.0+dfsg-5 [21.4 kB] 310s Get:18 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-es6-promise all 4.2.8-12 [14.1 kB] 310s Get:19 http://ftpmaster.internal/ubuntu oracular/universe amd64 node-jed all 1.1.1-4 [15.2 kB] 310s Get:20 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-jed all 1.1.1-4 [2584 B] 310s Get:21 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-jquery-typeahead all 2.11.0+dfsg1-3 [48.9 kB] 310s Get:22 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-jquery-ui all 1.13.2+dfsg-1 [252 kB] 310s Get:23 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-moment all 2.29.4+ds-1 [147 kB] 310s Get:24 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-text-encoding all 0.7.0-5 [140 kB] 310s Get:25 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-xterm all 5.3.0-2 [476 kB] 310s Get:26 http://ftpmaster.internal/ubuntu oracular/universe amd64 libnorm1t64 amd64 1.5.9+dfsg-3.1build1 [154 kB] 310s Get:27 http://ftpmaster.internal/ubuntu oracular/universe amd64 libpgm-5.3-0t64 amd64 5.3.128~dfsg-2.1build1 [167 kB] 310s Get:28 http://ftpmaster.internal/ubuntu oracular/main amd64 libsodium23 amd64 1.0.18-1build3 [161 kB] 310s Get:29 http://ftpmaster.internal/ubuntu oracular/universe amd64 libzmq5 amd64 4.3.5-1build2 [260 kB] 310s Get:30 http://ftpmaster.internal/ubuntu oracular/universe amd64 python-tinycss2-common all 1.3.0-1 [34.1 kB] 310s Get:31 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-all amd64 3.12.3-0ubuntu1 [888 B] 310s Get:32 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-argon2 amd64 21.1.0-2build1 [21.0 kB] 310s Get:33 
http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-asttokens all 2.4.1-1 [20.9 kB] 310s Get:34 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-webencodings all 0.5.1-5 [11.5 kB] 310s Get:35 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-html5lib all 1.1-6 [88.8 kB] 310s Get:36 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-bleach all 6.1.0-2 [49.6 kB] 310s Get:37 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-soupsieve all 2.5-1 [33.0 kB] 310s Get:38 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-bs4 all 4.12.3-1 [109 kB] 310s Get:39 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-bytecode all 0.15.1-3 [44.7 kB] 310s Get:40 http://ftpmaster.internal/ubuntu oracular-proposed/universe amd64 python3-traitlets all 5.14.3-1 [71.3 kB] 310s Get:41 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-comm all 0.2.1-1 [7016 B] 310s Get:42 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-coverage amd64 7.4.4+dfsg1-0ubuntu2 [147 kB] 310s Get:43 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-dateutil all 2.9.0-2 [80.3 kB] 310s Get:44 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-pydevd amd64 2.10.0+ds-10ubuntu1 [637 kB] 310s Get:45 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-debugpy all 1.8.0+ds-4ubuntu4 [67.6 kB] 310s Get:46 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-decorator all 5.1.1-5 [10.1 kB] 310s Get:47 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-defusedxml all 0.7.1-2 [42.0 kB] 310s Get:48 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-entrypoints all 0.4-2 [7146 B] 310s Get:49 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-executing all 2.0.1-0.1 [23.3 kB] 310s Get:50 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-fastjsonschema all 2.19.1-1 [19.7 kB] 310s Get:51 http://ftpmaster.internal/ubuntu 
oracular/universe amd64 python3-parso all 0.8.3-1 [67.2 kB] 310s Get:52 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-typeshed all 0.0~git20231111.6764465-3 [1274 kB] 310s Get:53 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-jedi all 0.19.1+ds1-1 [693 kB] 310s Get:54 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-matplotlib-inline all 0.1.6-2 [8784 B] 310s Get:55 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-ptyprocess all 0.7.0-5 [15.1 kB] 310s Get:56 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-pexpect all 4.9-2 [48.1 kB] 310s Get:57 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-wcwidth all 0.2.5+dfsg1-1.1ubuntu1 [22.5 kB] 310s Get:58 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-prompt-toolkit all 3.0.46-1 [256 kB] 310s Get:59 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-pure-eval all 0.2.2-2 [11.1 kB] 310s Get:60 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-stack-data all 0.6.3-1 [22.0 kB] 310s Get:61 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-ipython all 8.20.0-1ubuntu1 [561 kB] 310s Get:62 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-platformdirs all 4.2.1-1 [16.3 kB] 310s Get:63 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-jupyter-core all 5.3.2-2 [25.5 kB] 310s Get:64 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-nest-asyncio all 1.5.4-1 [6256 B] 310s Get:65 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-tornado amd64 6.4.1-1 [298 kB] 310s Get:66 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-py all 1.11.0-2 [72.7 kB] 310s Get:67 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-zmq amd64 24.0.1-5build1 [286 kB] 310s Get:68 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-jupyter-client all 7.4.9-2ubuntu1 [90.5 kB] 310s Get:69 http://ftpmaster.internal/ubuntu 
oracular/main amd64 python3-packaging all 24.0-1 [41.1 kB] 310s Get:70 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-psutil amd64 5.9.8-2build2 [195 kB] 310s Get:71 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-ipykernel all 6.29.3-1ubuntu1 [82.6 kB] 310s Get:72 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-ipython-genutils all 0.2.0-6 [22.0 kB] 310s Get:73 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-jupyterlab-pygments all 0.2.2-3 [6054 B] 310s Get:74 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-mistune all 3.0.2-1 [32.8 kB] 310s Get:75 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-nbformat all 5.9.1-1 [41.2 kB] 310s Get:76 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-nbclient all 0.8.0-1 [55.6 kB] 310s Get:77 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-pandocfilters all 1.5.1-1 [23.6 kB] 310s Get:78 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-tinycss2 all 1.3.0-1 [19.6 kB] 310s Get:79 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-nbconvert all 7.16.4-1 [156 kB] 310s Get:80 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-codemirror all 5.65.0+~cs5.83.9-3 [755 kB] 310s Get:81 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-marked all 4.2.3+ds+~4.0.7-3 [36.2 kB] 310s Get:82 http://ftpmaster.internal/ubuntu oracular/main amd64 libjs-mathjax all 2.7.9+dfsg-1 [5665 kB] 310s Get:83 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-requirejs all 2.3.6+ds+~2.1.37-1 [201 kB] 310s Get:84 http://ftpmaster.internal/ubuntu oracular/universe amd64 libjs-requirejs-text all 2.0.12-1.1 [9056 B] 310s Get:85 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-terminado all 0.18.1-1 [13.2 kB] 310s Get:86 http://ftpmaster.internal/ubuntu oracular/main amd64 python3-prometheus-client all 0.19.0+ds1-1 [41.7 kB] 310s Get:87 
http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-send2trash all 1.8.2-1 [15.5 kB] 310s Get:88 http://ftpmaster.internal/ubuntu oracular/universe amd64 python3-notebook all 6.4.12-2.2ubuntu1 [1566 kB] 311s Preconfiguring packages ... 311s Fetched 26.6 MB in 0s (100 MB/s) 311s Selecting previously unselected package libdebuginfod-common. 311s (Reading database ... 74430 files and directories currently installed.) 311s Preparing to unpack .../00-libdebuginfod-common_0.191-1_all.deb ... 311s Unpacking libdebuginfod-common (0.191-1) ... 311s Selecting previously unselected package fonts-font-awesome. 311s Preparing to unpack .../01-fonts-font-awesome_5.0.10+really4.7.0~dfsg-4.1_all.deb ... 311s Unpacking fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 311s Selecting previously unselected package fonts-glyphicons-halflings. 311s Preparing to unpack .../02-fonts-glyphicons-halflings_1.009~3.4.1+dfsg-3_all.deb ... 311s Unpacking fonts-glyphicons-halflings (1.009~3.4.1+dfsg-3) ... 311s Selecting previously unselected package fonts-mathjax. 311s Preparing to unpack .../03-fonts-mathjax_2.7.9+dfsg-1_all.deb ... 311s Unpacking fonts-mathjax (2.7.9+dfsg-1) ... 312s Selecting previously unselected package libbabeltrace1:amd64. 312s Preparing to unpack .../04-libbabeltrace1_1.5.11-3build3_amd64.deb ... 312s Unpacking libbabeltrace1:amd64 (1.5.11-3build3) ... 312s Selecting previously unselected package libdebuginfod1t64:amd64. 
312s Preparing to unpack .../05-libdebuginfod1t64_0.191-1_amd64.deb ... 312s Unpacking libdebuginfod1t64:amd64 (0.191-1) ... 312s Selecting previously unselected package libipt2. 312s Preparing to unpack .../06-libipt2_2.0.6-1build1_amd64.deb ... 312s Unpacking libipt2 (2.0.6-1build1) ... 312s Selecting previously unselected package libpython3.12t64:amd64. 312s Preparing to unpack .../07-libpython3.12t64_3.12.4-1_amd64.deb ... 312s Unpacking libpython3.12t64:amd64 (3.12.4-1) ... 312s Selecting previously unselected package libsource-highlight-common. 312s Preparing to unpack .../08-libsource-highlight-common_3.1.9-4.3build1_all.deb ... 312s Unpacking libsource-highlight-common (3.1.9-4.3build1) ... 312s Selecting previously unselected package libsource-highlight4t64:amd64. 312s Preparing to unpack .../09-libsource-highlight4t64_3.1.9-4.3build1_amd64.deb ... 312s Unpacking libsource-highlight4t64:amd64 (3.1.9-4.3build1) ... 312s Selecting previously unselected package gdb. 312s Preparing to unpack .../10-gdb_15.0.50.20240403-0ubuntu1_amd64.deb ... 312s Unpacking gdb (15.0.50.20240403-0ubuntu1) ... 312s Selecting previously unselected package libjs-underscore. 312s Preparing to unpack .../11-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ... 312s Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 312s Selecting previously unselected package libjs-backbone. 312s Preparing to unpack .../12-libjs-backbone_1.4.1~dfsg+~1.4.15-3_all.deb ... 312s Unpacking libjs-backbone (1.4.1~dfsg+~1.4.15-3) ... 312s Selecting previously unselected package libjs-bootstrap. 312s Preparing to unpack .../13-libjs-bootstrap_3.4.1+dfsg-3_all.deb ... 312s Unpacking libjs-bootstrap (3.4.1+dfsg-3) ... 312s Selecting previously unselected package libjs-jquery. 312s Preparing to unpack .../14-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ... 312s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 312s Selecting previously unselected package libjs-bootstrap-tour. 
312s Preparing to unpack .../15-libjs-bootstrap-tour_0.12.0+dfsg-5_all.deb ... 312s Unpacking libjs-bootstrap-tour (0.12.0+dfsg-5) ... 312s Selecting previously unselected package libjs-es6-promise. 312s Preparing to unpack .../16-libjs-es6-promise_4.2.8-12_all.deb ... 312s Unpacking libjs-es6-promise (4.2.8-12) ... 312s Selecting previously unselected package node-jed. 312s Preparing to unpack .../17-node-jed_1.1.1-4_all.deb ... 312s Unpacking node-jed (1.1.1-4) ... 312s Selecting previously unselected package libjs-jed. 312s Preparing to unpack .../18-libjs-jed_1.1.1-4_all.deb ... 312s Unpacking libjs-jed (1.1.1-4) ... 312s Selecting previously unselected package libjs-jquery-typeahead. 312s Preparing to unpack .../19-libjs-jquery-typeahead_2.11.0+dfsg1-3_all.deb ... 312s Unpacking libjs-jquery-typeahead (2.11.0+dfsg1-3) ... 312s Selecting previously unselected package libjs-jquery-ui. 312s Preparing to unpack .../20-libjs-jquery-ui_1.13.2+dfsg-1_all.deb ... 312s Unpacking libjs-jquery-ui (1.13.2+dfsg-1) ... 312s Selecting previously unselected package libjs-moment. 312s Preparing to unpack .../21-libjs-moment_2.29.4+ds-1_all.deb ... 312s Unpacking libjs-moment (2.29.4+ds-1) ... 312s Selecting previously unselected package libjs-text-encoding. 312s Preparing to unpack .../22-libjs-text-encoding_0.7.0-5_all.deb ... 312s Unpacking libjs-text-encoding (0.7.0-5) ... 312s Selecting previously unselected package libjs-xterm. 312s Preparing to unpack .../23-libjs-xterm_5.3.0-2_all.deb ... 312s Unpacking libjs-xterm (5.3.0-2) ... 312s Selecting previously unselected package libnorm1t64:amd64. 312s Preparing to unpack .../24-libnorm1t64_1.5.9+dfsg-3.1build1_amd64.deb ... 312s Unpacking libnorm1t64:amd64 (1.5.9+dfsg-3.1build1) ... 312s Selecting previously unselected package libpgm-5.3-0t64:amd64. 312s Preparing to unpack .../25-libpgm-5.3-0t64_5.3.128~dfsg-2.1build1_amd64.deb ... 312s Unpacking libpgm-5.3-0t64:amd64 (5.3.128~dfsg-2.1build1) ... 
312s Selecting previously unselected package libsodium23:amd64.
312s Preparing to unpack .../26-libsodium23_1.0.18-1build3_amd64.deb ...
312s Unpacking libsodium23:amd64 (1.0.18-1build3) ...
312s Selecting previously unselected package libzmq5:amd64.
312s Preparing to unpack .../27-libzmq5_4.3.5-1build2_amd64.deb ...
312s Unpacking libzmq5:amd64 (4.3.5-1build2) ...
313s Selecting previously unselected package python-tinycss2-common.
313s Preparing to unpack .../28-python-tinycss2-common_1.3.0-1_all.deb ...
313s Unpacking python-tinycss2-common (1.3.0-1) ...
313s Selecting previously unselected package python3-all.
313s Preparing to unpack .../29-python3-all_3.12.3-0ubuntu1_amd64.deb ...
313s Unpacking python3-all (3.12.3-0ubuntu1) ...
313s Selecting previously unselected package python3-argon2.
313s Preparing to unpack .../30-python3-argon2_21.1.0-2build1_amd64.deb ...
313s Unpacking python3-argon2 (21.1.0-2build1) ...
313s Selecting previously unselected package python3-asttokens.
313s Preparing to unpack .../31-python3-asttokens_2.4.1-1_all.deb ...
313s Unpacking python3-asttokens (2.4.1-1) ...
313s Selecting previously unselected package python3-webencodings.
313s Preparing to unpack .../32-python3-webencodings_0.5.1-5_all.deb ...
313s Unpacking python3-webencodings (0.5.1-5) ...
313s Selecting previously unselected package python3-html5lib.
313s Preparing to unpack .../33-python3-html5lib_1.1-6_all.deb ...
313s Unpacking python3-html5lib (1.1-6) ...
313s Selecting previously unselected package python3-bleach.
313s Preparing to unpack .../34-python3-bleach_6.1.0-2_all.deb ...
313s Unpacking python3-bleach (6.1.0-2) ...
313s Selecting previously unselected package python3-soupsieve.
313s Preparing to unpack .../35-python3-soupsieve_2.5-1_all.deb ...
313s Unpacking python3-soupsieve (2.5-1) ...
313s Selecting previously unselected package python3-bs4.
313s Preparing to unpack .../36-python3-bs4_4.12.3-1_all.deb ...
313s Unpacking python3-bs4 (4.12.3-1) ...
313s Selecting previously unselected package python3-bytecode.
313s Preparing to unpack .../37-python3-bytecode_0.15.1-3_all.deb ...
313s Unpacking python3-bytecode (0.15.1-3) ...
313s Selecting previously unselected package python3-traitlets.
313s Preparing to unpack .../38-python3-traitlets_5.14.3-1_all.deb ...
313s Unpacking python3-traitlets (5.14.3-1) ...
313s Selecting previously unselected package python3-comm.
313s Preparing to unpack .../39-python3-comm_0.2.1-1_all.deb ...
313s Unpacking python3-comm (0.2.1-1) ...
313s Selecting previously unselected package python3-coverage.
313s Preparing to unpack .../40-python3-coverage_7.4.4+dfsg1-0ubuntu2_amd64.deb ...
313s Unpacking python3-coverage (7.4.4+dfsg1-0ubuntu2) ...
313s Selecting previously unselected package python3-dateutil.
313s Preparing to unpack .../41-python3-dateutil_2.9.0-2_all.deb ...
313s Unpacking python3-dateutil (2.9.0-2) ...
313s Selecting previously unselected package python3-pydevd.
313s Preparing to unpack .../42-python3-pydevd_2.10.0+ds-10ubuntu1_amd64.deb ...
313s Unpacking python3-pydevd (2.10.0+ds-10ubuntu1) ...
313s Selecting previously unselected package python3-debugpy.
313s Preparing to unpack .../43-python3-debugpy_1.8.0+ds-4ubuntu4_all.deb ...
313s Unpacking python3-debugpy (1.8.0+ds-4ubuntu4) ...
313s Selecting previously unselected package python3-decorator.
313s Preparing to unpack .../44-python3-decorator_5.1.1-5_all.deb ...
313s Unpacking python3-decorator (5.1.1-5) ...
313s Selecting previously unselected package python3-defusedxml.
313s Preparing to unpack .../45-python3-defusedxml_0.7.1-2_all.deb ...
313s Unpacking python3-defusedxml (0.7.1-2) ...
313s Selecting previously unselected package python3-entrypoints.
313s Preparing to unpack .../46-python3-entrypoints_0.4-2_all.deb ...
313s Unpacking python3-entrypoints (0.4-2) ...
313s Selecting previously unselected package python3-executing.
313s Preparing to unpack .../47-python3-executing_2.0.1-0.1_all.deb ...
313s Unpacking python3-executing (2.0.1-0.1) ...
313s Selecting previously unselected package python3-fastjsonschema.
313s Preparing to unpack .../48-python3-fastjsonschema_2.19.1-1_all.deb ...
313s Unpacking python3-fastjsonschema (2.19.1-1) ...
313s Selecting previously unselected package python3-parso.
313s Preparing to unpack .../49-python3-parso_0.8.3-1_all.deb ...
313s Unpacking python3-parso (0.8.3-1) ...
313s Selecting previously unselected package python3-typeshed.
313s Preparing to unpack .../50-python3-typeshed_0.0~git20231111.6764465-3_all.deb ...
313s Unpacking python3-typeshed (0.0~git20231111.6764465-3) ...
314s Selecting previously unselected package python3-jedi.
314s Preparing to unpack .../51-python3-jedi_0.19.1+ds1-1_all.deb ...
314s Unpacking python3-jedi (0.19.1+ds1-1) ...
314s Selecting previously unselected package python3-matplotlib-inline.
314s Preparing to unpack .../52-python3-matplotlib-inline_0.1.6-2_all.deb ...
314s Unpacking python3-matplotlib-inline (0.1.6-2) ...
314s Selecting previously unselected package python3-ptyprocess.
314s Preparing to unpack .../53-python3-ptyprocess_0.7.0-5_all.deb ...
314s Unpacking python3-ptyprocess (0.7.0-5) ...
314s Selecting previously unselected package python3-pexpect.
314s Preparing to unpack .../54-python3-pexpect_4.9-2_all.deb ...
314s Unpacking python3-pexpect (4.9-2) ...
314s Selecting previously unselected package python3-wcwidth.
314s Preparing to unpack .../55-python3-wcwidth_0.2.5+dfsg1-1.1ubuntu1_all.deb ...
314s Unpacking python3-wcwidth (0.2.5+dfsg1-1.1ubuntu1) ...
314s Selecting previously unselected package python3-prompt-toolkit.
314s Preparing to unpack .../56-python3-prompt-toolkit_3.0.46-1_all.deb ...
314s Unpacking python3-prompt-toolkit (3.0.46-1) ...
314s Selecting previously unselected package python3-pure-eval.
314s Preparing to unpack .../57-python3-pure-eval_0.2.2-2_all.deb ...
314s Unpacking python3-pure-eval (0.2.2-2) ...
314s Selecting previously unselected package python3-stack-data.
314s Preparing to unpack .../58-python3-stack-data_0.6.3-1_all.deb ...
314s Unpacking python3-stack-data (0.6.3-1) ...
314s Selecting previously unselected package python3-ipython.
314s Preparing to unpack .../59-python3-ipython_8.20.0-1ubuntu1_all.deb ...
314s Unpacking python3-ipython (8.20.0-1ubuntu1) ...
314s Selecting previously unselected package python3-platformdirs.
314s Preparing to unpack .../60-python3-platformdirs_4.2.1-1_all.deb ...
314s Unpacking python3-platformdirs (4.2.1-1) ...
314s Selecting previously unselected package python3-jupyter-core.
314s Preparing to unpack .../61-python3-jupyter-core_5.3.2-2_all.deb ...
314s Unpacking python3-jupyter-core (5.3.2-2) ...
314s Selecting previously unselected package python3-nest-asyncio.
314s Preparing to unpack .../62-python3-nest-asyncio_1.5.4-1_all.deb ...
314s Unpacking python3-nest-asyncio (1.5.4-1) ...
314s Selecting previously unselected package python3-tornado.
314s Preparing to unpack .../63-python3-tornado_6.4.1-1_amd64.deb ...
314s Unpacking python3-tornado (6.4.1-1) ...
314s Selecting previously unselected package python3-py.
314s Preparing to unpack .../64-python3-py_1.11.0-2_all.deb ...
314s Unpacking python3-py (1.11.0-2) ...
315s Selecting previously unselected package python3-zmq.
315s Preparing to unpack .../65-python3-zmq_24.0.1-5build1_amd64.deb ...
315s Unpacking python3-zmq (24.0.1-5build1) ...
315s Selecting previously unselected package python3-jupyter-client.
315s Preparing to unpack .../66-python3-jupyter-client_7.4.9-2ubuntu1_all.deb ...
315s Unpacking python3-jupyter-client (7.4.9-2ubuntu1) ...
315s Selecting previously unselected package python3-packaging.
315s Preparing to unpack .../67-python3-packaging_24.0-1_all.deb ...
315s Unpacking python3-packaging (24.0-1) ...
315s Selecting previously unselected package python3-psutil.
315s Preparing to unpack .../68-python3-psutil_5.9.8-2build2_amd64.deb ...
315s Unpacking python3-psutil (5.9.8-2build2) ...
315s Selecting previously unselected package python3-ipykernel.
315s Preparing to unpack .../69-python3-ipykernel_6.29.3-1ubuntu1_all.deb ...
315s Unpacking python3-ipykernel (6.29.3-1ubuntu1) ...
315s Selecting previously unselected package python3-ipython-genutils.
315s Preparing to unpack .../70-python3-ipython-genutils_0.2.0-6_all.deb ...
315s Unpacking python3-ipython-genutils (0.2.0-6) ...
315s Selecting previously unselected package python3-jupyterlab-pygments.
315s Preparing to unpack .../71-python3-jupyterlab-pygments_0.2.2-3_all.deb ...
315s Unpacking python3-jupyterlab-pygments (0.2.2-3) ...
315s Selecting previously unselected package python3-mistune.
315s Preparing to unpack .../72-python3-mistune_3.0.2-1_all.deb ...
315s Unpacking python3-mistune (3.0.2-1) ...
315s Selecting previously unselected package python3-nbformat.
315s Preparing to unpack .../73-python3-nbformat_5.9.1-1_all.deb ...
315s Unpacking python3-nbformat (5.9.1-1) ...
315s Selecting previously unselected package python3-nbclient.
315s Preparing to unpack .../74-python3-nbclient_0.8.0-1_all.deb ...
315s Unpacking python3-nbclient (0.8.0-1) ...
315s Selecting previously unselected package python3-pandocfilters.
315s Preparing to unpack .../75-python3-pandocfilters_1.5.1-1_all.deb ...
315s Unpacking python3-pandocfilters (1.5.1-1) ...
315s Selecting previously unselected package python3-tinycss2.
315s Preparing to unpack .../76-python3-tinycss2_1.3.0-1_all.deb ...
315s Unpacking python3-tinycss2 (1.3.0-1) ...
315s Selecting previously unselected package python3-nbconvert.
315s Preparing to unpack .../77-python3-nbconvert_7.16.4-1_all.deb ...
315s Unpacking python3-nbconvert (7.16.4-1) ...
315s Selecting previously unselected package libjs-codemirror.
315s Preparing to unpack .../78-libjs-codemirror_5.65.0+~cs5.83.9-3_all.deb ...
315s Unpacking libjs-codemirror (5.65.0+~cs5.83.9-3) ...
315s Selecting previously unselected package libjs-marked.
315s Preparing to unpack .../79-libjs-marked_4.2.3+ds+~4.0.7-3_all.deb ...
315s Unpacking libjs-marked (4.2.3+ds+~4.0.7-3) ...
315s Selecting previously unselected package libjs-mathjax.
315s Preparing to unpack .../80-libjs-mathjax_2.7.9+dfsg-1_all.deb ...
315s Unpacking libjs-mathjax (2.7.9+dfsg-1) ...
316s Selecting previously unselected package libjs-requirejs.
316s Preparing to unpack .../81-libjs-requirejs_2.3.6+ds+~2.1.37-1_all.deb ...
316s Unpacking libjs-requirejs (2.3.6+ds+~2.1.37-1) ...
316s Selecting previously unselected package libjs-requirejs-text.
316s Preparing to unpack .../82-libjs-requirejs-text_2.0.12-1.1_all.deb ...
316s Unpacking libjs-requirejs-text (2.0.12-1.1) ...
316s Selecting previously unselected package python3-terminado.
316s Preparing to unpack .../83-python3-terminado_0.18.1-1_all.deb ...
316s Unpacking python3-terminado (0.18.1-1) ...
316s Selecting previously unselected package python3-prometheus-client.
316s Preparing to unpack .../84-python3-prometheus-client_0.19.0+ds1-1_all.deb ...
316s Unpacking python3-prometheus-client (0.19.0+ds1-1) ...
316s Selecting previously unselected package python3-send2trash.
316s Preparing to unpack .../85-python3-send2trash_1.8.2-1_all.deb ...
316s Unpacking python3-send2trash (1.8.2-1) ...
316s Selecting previously unselected package python3-notebook.
316s Preparing to unpack .../86-python3-notebook_6.4.12-2.2ubuntu1_all.deb ...
316s Unpacking python3-notebook (6.4.12-2.2ubuntu1) ...
316s Selecting previously unselected package autopkgtest-satdep.
316s Preparing to unpack .../87-3-autopkgtest-satdep.deb ...
316s Unpacking autopkgtest-satdep (0) ...
316s Setting up python3-entrypoints (0.4-2) ...
316s Setting up libjs-jquery-typeahead (2.11.0+dfsg1-3) ...
316s Setting up python3-tornado (6.4.1-1) ...
317s Setting up libnorm1t64:amd64 (1.5.9+dfsg-3.1build1) ...
317s Setting up python3-pure-eval (0.2.2-2) ...
317s Setting up python3-send2trash (1.8.2-1) ...
317s Setting up fonts-mathjax (2.7.9+dfsg-1) ...
317s Setting up libsodium23:amd64 (1.0.18-1build3) ...
317s Setting up libjs-mathjax (2.7.9+dfsg-1) ...
317s Setting up python3-py (1.11.0-2) ...
317s Setting up libdebuginfod-common (0.191-1) ...
317s Setting up libjs-requirejs-text (2.0.12-1.1) ...
317s Setting up python3-parso (0.8.3-1) ...
318s Setting up python3-defusedxml (0.7.1-2) ...
318s Setting up python3-ipython-genutils (0.2.0-6) ...
318s Setting up python3-asttokens (2.4.1-1) ...
318s Setting up fonts-glyphicons-halflings (1.009~3.4.1+dfsg-3) ...
318s Setting up python3-all (3.12.3-0ubuntu1) ...
318s Setting up python3-coverage (7.4.4+dfsg1-0ubuntu2) ...
318s Setting up libjs-moment (2.29.4+ds-1) ...
318s Setting up python3-pandocfilters (1.5.1-1) ...
318s Setting up libjs-requirejs (2.3.6+ds+~2.1.37-1) ...
318s Setting up libjs-es6-promise (4.2.8-12) ...
318s Setting up libjs-text-encoding (0.7.0-5) ...
318s Setting up python3-webencodings (0.5.1-5) ...
318s Setting up python3-platformdirs (4.2.1-1) ...
318s Setting up python3-psutil (5.9.8-2build2) ...
319s Setting up libsource-highlight-common (3.1.9-4.3build1) ...
319s Setting up python3-jupyterlab-pygments (0.2.2-3) ...
319s Setting up libpython3.12t64:amd64 (3.12.4-1) ...
319s Setting up libpgm-5.3-0t64:amd64 (5.3.128~dfsg-2.1build1) ...
319s Setting up python3-decorator (5.1.1-5) ...
319s Setting up python3-packaging (24.0-1) ...
319s Setting up python3-wcwidth (0.2.5+dfsg1-1.1ubuntu1) ...
319s Setting up node-jed (1.1.1-4) ...
319s Setting up python3-typeshed (0.0~git20231111.6764465-3) ...
319s Setting up python3-executing (2.0.1-0.1) ...
319s Setting up libjs-xterm (5.3.0-2) ...
319s Setting up python3-nest-asyncio (1.5.4-1) ...
319s Setting up python3-bytecode (0.15.1-3) ...
320s Setting up libjs-codemirror (5.65.0+~cs5.83.9-3) ...
320s Setting up libjs-jed (1.1.1-4) ...
320s Setting up libipt2 (2.0.6-1build1) ...
320s Setting up python3-html5lib (1.1-6) ...
320s Setting up libbabeltrace1:amd64 (1.5.11-3build3) ...
320s Setting up python3-fastjsonschema (2.19.1-1) ...
320s Setting up python3-traitlets (5.14.3-1) ...
320s Setting up python-tinycss2-common (1.3.0-1) ...
320s Setting up python3-argon2 (21.1.0-2build1) ...
320s Setting up python3-dateutil (2.9.0-2) ...
320s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ...
320s Setting up python3-mistune (3.0.2-1) ...
321s Setting up python3-stack-data (0.6.3-1) ...
321s Setting up python3-soupsieve (2.5-1) ...
321s Setting up fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ...
321s Setting up python3-jupyter-core (5.3.2-2) ...
321s Setting up libjs-bootstrap (3.4.1+dfsg-3) ...
321s Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ...
321s Setting up python3-ptyprocess (0.7.0-5) ...
321s Setting up libjs-marked (4.2.3+ds+~4.0.7-3) ...
321s Setting up python3-prompt-toolkit (3.0.46-1) ...
321s Setting up libdebuginfod1t64:amd64 (0.191-1) ...
321s Setting up python3-tinycss2 (1.3.0-1) ...
322s Setting up libzmq5:amd64 (4.3.5-1build2) ...
322s Setting up python3-jedi (0.19.1+ds1-1) ...
322s Setting up libjs-bootstrap-tour (0.12.0+dfsg-5) ...
322s Setting up libjs-backbone (1.4.1~dfsg+~1.4.15-3) ...
322s Setting up libsource-highlight4t64:amd64 (3.1.9-4.3build1) ...
322s Setting up python3-nbformat (5.9.1-1) ...
322s Setting up python3-bs4 (4.12.3-1) ...
322s Setting up python3-bleach (6.1.0-2) ...
322s Setting up python3-matplotlib-inline (0.1.6-2) ...
322s Setting up python3-comm (0.2.1-1) ...
323s Setting up python3-prometheus-client (0.19.0+ds1-1) ...
323s Setting up gdb (15.0.50.20240403-0ubuntu1) ...
323s Setting up libjs-jquery-ui (1.13.2+dfsg-1) ...
323s Setting up python3-pexpect (4.9-2) ...
323s Setting up python3-zmq (24.0.1-5build1) ...
323s Setting up python3-terminado (0.18.1-1) ...
323s Setting up python3-jupyter-client (7.4.9-2ubuntu1) ...
323s Setting up python3-pydevd (2.10.0+ds-10ubuntu1) ...
324s Setting up python3-debugpy (1.8.0+ds-4ubuntu4) ...
324s Setting up python3-nbclient (0.8.0-1) ...
324s Setting up python3-ipython (8.20.0-1ubuntu1) ...
325s Setting up python3-ipykernel (6.29.3-1ubuntu1) ...
325s Setting up python3-nbconvert (7.16.4-1) ...
325s Setting up python3-notebook (6.4.12-2.2ubuntu1) ...
326s Setting up autopkgtest-satdep (0) ...
326s Processing triggers for man-db (2.12.1-2) ...
326s Processing triggers for libc-bin (2.39-0ubuntu9) ...
330s (Reading database ... 90638 files and directories currently installed.)
330s Removing autopkgtest-satdep (0) ...
331s autopkgtest [11:56:59]: test autodep8-python3: set -e ; for py in $(py3versions -r 2>/dev/null) ; do cd "$AUTOPKGTEST_TMP" ; echo "Testing with $py:" ; $py -c "import notebook; print(notebook)" ; done
331s autopkgtest [11:56:59]: test autodep8-python3: [-----------------------
331s Testing with python3.12:
331s
331s autopkgtest [11:56:59]: test autodep8-python3: -----------------------]
332s autodep8-python3 PASS (superficial)
332s autopkgtest [11:57:00]: test autodep8-python3: - - - - - - - - - - results - - - - - - - - - -
332s autopkgtest [11:57:00]: @@@@@@@@@@@@@@@@@@@@ summary
332s pytest FAIL non-zero exit status 1
332s command1 PASS (superficial)
332s autodep8-python3 PASS (superficial)
341s nova [W] Skipping flock for amd64
341s Creating nova instance adt-oracular-amd64-jupyter-notebook-20240616-102220-juju-7f2275-prod-proposed-migration-environment-2-e4388c01-ca29-4dd9-9535-fafa0350e03b from image adt/ubuntu-oracular-amd64-server-20240615.img (UUID 94fd7b6f-4495-4684-bd31-1d1deeaa1788)...